{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "view-in-github", "colab_type": "text" }, "source": [ "\"Open" ] }, { "cell_type": "markdown", "metadata": { "id": "wNCGmSXbfZRr" }, "source": [ "### w-okada's Voice Changer version 2.x | **Google Colab**\n", "\n", "## READ ME - VERY IMPORTANT\n", "This is an attempt to run [Realtime Voice Changer](https://github.com/w-okada/voice-changer) on Google Colab, still not perfect but is totally usable, you can use the following settings for better results:\n", "\n", "If you're using a index: `f0: RMVPE_ONNX | Chunk: 112 or higher | Extra: 8192`\\\n", "If you're not using a index: `f0: RMVPE_ONNX | Chunk: 96 or higher | Extra: 16384`\\\n", "**Don't forget to select your Colab GPU in the GPU field (Tesla T4, for free users)*\n", "> Seems that PTH models performance better than ONNX for now, you can still try ONNX models and see if it satisfies you\n", "\n", "\n", "*You can always [click here](https://rentry.co/VoiceChangerGuide#gpu-chart-for-known-working-chunkextra\n", ") to check if these settings are up-to-date*\n", "

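"\n", "A quick way to confirm that a GPU is actually attached (a minimal sanity check; `nvidia-smi` should be available on Colab GPU runtimes) is to run the following in a new code cell:\n", "```\n", "!nvidia-smi\n", "```\n",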
\n", "\n", "---\n", "\n", "###Always use Colab GPU (**VERY VERY VERY IMPORTANT!**)\n", "You need to use a Colab GPU so the Voice Changer can work faster and better\\\n", "Use the menu above and click on **Runtime** » **Change runtime** » **Hardware acceleration** to select a GPU (**T4 is the free one**)\n", "\n", "---\n", "\n", "\n", "# **Credits and Support**\n", "Realtime Voice Changer by [w-okada](https://github.com/w-okada)\\\n", "Colab files updated by [rafacasari](https://github.com/Rafacasari)\\\n", "Recommended settings by [Raven](https://github.com/ravencutie21)\\\n", "Modified again by [Hina](https://github.com/HinaBl)\\\n", "Enable FCPE by [TheTrustedComputer](https://github.com/TheTrustedComputer)\n", "\n", "Need help? [AI Hub Discord](https://discord.gg/aihub) » ***#help-realtime-vc***\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "W2GYWTHWmRIY", "cellView": "form" }, "outputs": [], "source": [ "#=================Updated=================\n", "# @title **[1]** Clone repository and install dependencies\n", "# @markdown This first step will download the latest version of Voice Changer and install the dependencies. **It can take some time to complete.**\n", "\n", "#@markdown ---\n", "# @title **[Optional]** Connect to Google Drive\n", "# @markdown Using Google Drive will automatically save your uploaded models for later use.\n", "\n", "\n", "import os\n", "import time\n", "import subprocess\n", "import threading\n", "import shutil\n", "import base64\n", "import codecs\n", "\n", "# Configs\n", "Run_Cell=0\n", "Use_Drive=True #@param {type:\"boolean\"}\n", "\n", "notebook_env=0\n", "if os.path.exists('/content'):\n", " notebook_env=1\n", " print(\"Welcome to ColabMod\")\n", " from google.colab import drive\n", "\n", "elif os.path.exists('/kaggle/working'):\n", " notebook_env=2\n", " print(\"Welcome to Kaggle Mod\")\n", "else:\n", " notebook_env=3\n", " print(\"Welcome!\")\n", "\n", "from IPython.display import clear_output, Javascript\n", "\n", "print(\"Installing libportaudio2... \",end=\"\")\n", "!sudo apt-get install -y libportaudio2 > /dev/null 2>&1\n", "!pip install pyngrok > /dev/null 2>&1\n", "print(\"done.\")\n", "\n", "if notebook_env==1 and Use_Drive==True:\n", " work_dir = \"/content/drive/MyDrive/vcclient\"\n", " if not os.path.exists('/content/drive'):\n", " drive.mount('/content/drive')\n", "\n", " if not os.path.exists(work_dir):\n", " !mkdir -p {work_dir}\n", "\n", " print(\"Checking latest version...\")\n", " if os.path.exists(f'{work_dir}/latest_hash.txt'):\n", " current_version_hash = open(f'{work_dir}/latest_hash.txt').read().strip()\n", " else:\n", " current_version_hash = None\n", "\n", " !curl -s -L https://huggingface.co/wok000/vcclient000_colab/resolve/main/latest_hash.txt -o latest_hash.txt\n", " latest_version_hash = open('latest_hash.txt').read().strip()\n", "\n", " print(f\"current_version_hash: {current_version_hash}\")\n", " print(f\"latest_version_hash : {latest_version_hash}\")\n", "\n", " if current_version_hash != latest_version_hash:\n", " print(f\"hash not match -> download latest version\")\n", "\n", " latest_hash_path=f'{work_dir}/latest_hash.txt'\n", " !curl -L https://huggingface.co/wok000/vcclient000_colab/resolve/main/vcclient_latest_for_colab -o {work_dir}/vcclient_latest_for_colab\n", " !cp latest_hash.txt {latest_hash_path}\n", " else:\n", " print(\"hash matched. skip download\")\n", "\n", "else:\n", " work_dir = \"/content\"\n", " print(\"Downloading the latest vcclient... 
\")\n", " !curl -L https://huggingface.co/wok000/vcclient000_colab/resolve/main/vcclient_latest_for_colab -o {work_dir}/vcclient_latest_for_colab\n", " print(\"done.\")\n", "\n", "%cd {work_dir}\n", "!chmod 0700 vcclient_latest_for_colab\n", "!ls -lha vcclient_latest_for_colab\n", "\n", "Run_Cell=1\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "Dbx41M-zlknc", "cellView": "form" }, "outputs": [], "source": [ "PORT=8000\n", "\n", "import codecs\n", "\n", "# @title **[2]** Start ngrock\n", "# @markdown This cell will start the server, the first time that you run it will download the models, so it can take a while (~1-2 minutes)\n", "\n", "# @markdown ---\n", "# @markdown You'll need a ngrok account, but **it's free** and easy to create!\n", "# @markdown ---\n", "# @markdown **1** - Create a **free** account at [ngrok](https://dashboard.ngrok.com/signup) or **login with Google/Github account**\\\n", "# @markdown **2** - If you didn't logged in with Google/Github, you will need to **verify your e-mail**!\\\n", "# @markdown **3** - Click [this link](https://dashboard.ngrok.com/get-started/your-authtoken) to get your auth token, and place it here:\n", "Token = 'TOKEN' # @param {type:\"string\"}\n", "# @markdown **4** - *(optional)* Change to a region near to you or keep at United States if increase latency\\\n", "# @markdown `Default Region: ap - Asia/Pacific (Singapore)`\n", "Region = \"jp - Japan (Tokyo)\" # @param [\"ap - Asia/Pacific (Singapore)\", \"au - Australia (Sydney)\",\"eu - Europe (Frankfurt)\", \"in - India (Mumbai)\",\"jp - Japan (Tokyo)\",\"sa - South America (Sao Paulo)\", \"us - United States (Ohio)\"]\n", "\n", "#@markdown **5** - *(optional)* Other options:\n", "ClearConsole = True # @param {type:\"boolean\"}\n", "Play_Notification = False # @param {type:\"boolean\"}\n", "\n", "# ---------------------------------\n", "# DO NOT TOUCH ANYTHING DOWN BELOW!\n", "# ---------------------------------\n", "\n", "# Check if Run_Cell\n", "if 'Run_Cell' not in globals():\n", " print(\"No, Go back to the first cell and run it\")\n", " exit\n", "if Run_Cell == 0:\n", " print(\"No, Go back to the first cell and run it\")\n", " exit\n", "\n", "\n", "from pyngrok import conf, ngrok\n", "MyConfig = conf.PyngrokConfig()\n", "MyConfig.auth_token = Token\n", "MyConfig.region = Region[0:2]\n", "conf.set_default(MyConfig);\n", "\n", "from pyngrok import ngrok\n", "ngrokConnection = ngrok.connect(PORT)\n", "public_url = ngrokConnection.public_url\n", "print(f\"your app's url is {public_url}\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "s7mYqKtW6VOI", "cellView": "form" }, "outputs": [], "source": [ "# @title **[3]** Start server\n", "# @markdown This cell will start the server, the first time that you run it will download the models, so it can take a while (~1-2 minutes)\n", "\n", "# @markdown When you see the message \"running...\", please launch the application from the link above.\n", "!LD_LIBRARY_PATH=/usr/lib64-nvidia:/usr/lib/x86_64-linux-gnu ./vcclient_latest_for_colab cui --port {PORT} --no_cui true\n", "\n" ] } ], "metadata": { "accelerator": "GPU", "colab": { "gpuType": "T4", "provenance": [], "authorship_tag": "ABX9TyMfx2JSTFXzmmUSOvSzbfTf", "include_colab_link": true }, "kernelspec": { "display_name": "Python 3", "name": "python3" }, "language_info": { "name": "python" } }, "nbformat": 4, "nbformat_minor": 0 }