No, you can just run with the default settings and the current dataset in the project to test. If you want, you can also pick your own dataset, but you will need to tweak some settings; see this tutorial for more info.
Make sure to check the prerequisites before installing the extension. More details are at Prerequisites.
If you have an NVIDIA GPU but the prerequisites check fails with "GPU is not detected", make sure the latest driver is installed. You can check for and download the driver from the NVIDIA site. Also, make sure the driver is on the PATH. To check, run `nvidia-smi` from the command line.
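A quick check, assuming the driver is installed in its default location:

```bash
# If the driver is installed and on the PATH, this prints the GPU name,
# driver version, and supported CUDA version; otherwise it fails.
nvidia-smi
```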
There may have been an issue setting up the environment. You can manually initialize it by running the first-time setup script from inside the workspace.
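For example, from a shell inside the workspace (`[PROJECT_PATH]` is a placeholder for your project's location):

```bash
# Re-run the first-time setup script to initialize the environment manually.
bash /mnt/[PROJECT_PATH]/setup/first_time_setup.sh
```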
Before you start the `python finetuning/invoke_olive.py` command, make sure you run `huggingface-cli login`. This ensures the dataset can be downloaded on your behalf.
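For example:

```bash
# Authenticate with Hugging Face first; you will be prompted for an
# access token so the dataset can be downloaded on your behalf.
huggingface-cli login

# Then start the fine-tuning run.
python finetuning/invoke_olive.py
```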
Not at this time, but we are working to expand the list of models.
At this time we only support running the extension on Windows and Linux, but we are planning support for other platforms. The extension uses WSL, but the extension itself will not run from within the WSL environment.
At this time, Azure GPU VMs do not support nested virtualization, which is needed to run the WSL environment. This prevents the extension from working.
To disable conda auto-activation in WSL, you can run `conda config --set auto_activate_base false`; this disables automatic activation of the base environment.
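For example:

```bash
# Stop conda from auto-activating the base environment in new WSL shells.
conda config --set auto_activate_base false
```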
We are currently working on container support, and it will be enabled in a future release. For now, we use WSL as the location to run the pipeline, and we install the prerequisites there for you.
We host all the project templates on GitHub, and the base models are hosted in Azure or on Hugging Face, which requires accounts to access them from the APIs.
Please ensure you request access to Llama through the Llama 2 sign up page; this is needed to comply with Meta's trade compliance.
Because remote sessions are currently not supported when running the AI Toolkit Actions, you cannot save your project while connected to WSL. To close remote connections, click "WSL" at the bottom left of the screen and choose "Close Remote Connections".
We host the project templates in the GitHub repository microsoft/windows-ai-studio-templates, and the extension calls the GitHub API to load the repo content. If you are in Microsoft, you may need to authorize the Microsoft organization to avoid this forbidden error.
See this issue for the workaround. The detailed steps are:
- Sign out of your GitHub account in VS Code
- Reload VS Code and AI Toolkit; you will be asked to sign in to GitHub again
- [Important] On the browser's authorization page, make sure to authorize the app to access the "Microsoft" org
Check the 'AI Toolkit' log in the Output panel. If you see an Agent error or similar, please close all VS Code instances and reopen VS Code. (This is caused by the underlying ONNX agent closing unexpectedly; the step above restarts the agent.)