Installing and Accessing Ollama with Llama2 on Google Cloud VM (External Access)
This guide provides a comprehensive step-by-step process for installing Ollama, running the Llama2 model, and making it accessible from your external PC using a Google Cloud VM.
Prerequisites
- A Google Cloud Account
- Basic familiarity with the Linux command line.
Step 1: Create a Google Cloud VM Instance (Ubuntu)
- Go to the Google Cloud Console.
- Navigate to "Compute Engine" -> "VM instances."
- Click "Create Instance."
- Name: Provide a name for your VM instance.
- Region and Zone: Choose a region and zone that's geographically close to you.
- Machine Configuration:
- Select a machine type that meets your needs. For the Llama2 7B model, at least 8 GB of RAM is recommended, so a machine such as e2-standard-4 (4 vCPUs, 16 GB RAM) is a reasonable starting point.
- Boot Disk:
- Click "Change."
- Select "Ubuntu" as the operating system.
- Choose a version of Ubuntu (e.g., Ubuntu 22.04 LTS).
- Select a boot disk size (e.g., 50 GB).
- Click "Select."
- Networking:
- Click "Networking, disks, security, management, sole-tenancy."
- Navigate to the "Networking" tab.
- Under "Network interfaces," click the network interface.
- Under "External IPv4 address," select "Create IP address."
- Give the IP address a name, then click "Reserve".
- Click "Done".
- Firewall: The instance creation form only offers checkboxes for HTTP/HTTPS traffic, so opening port 11434 requires a separate VPC firewall rule. Navigate to "VPC network" -> "Firewall" -> "Create firewall rule", allow ingress traffic on tcp:11434 from your PC's IP range, and apply the rule to this VM (for example, via a network tag). A gcloud equivalent is shown after this list.
- Click "Create."
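If you prefer the gcloud CLI, the firewall rule described above can be created with a command like the one below. This is a sketch: the rule name (allow-ollama), the network tag (ollama), and the source range are placeholders to adapt to your setup, and the tag must also be added to the VM's network tags for the rule to apply.
# Allow inbound TCP on Ollama's default port, only for VMs tagged "ollama",
# and only from your own PC's address (replace YOUR_PC_IP)
gcloud compute firewall-rules create allow-ollama \
  --network=default \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:11434 \
  --target-tags=ollama \
  --source-ranges=YOUR_PC_IP/32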
Step 2: Connect to Your Google Cloud VM
- In the Google Cloud Console, navigate to "Compute Engine" -> "VM instances."
- Find your VM instance.
- Click the "SSH" button on the right-hand side of the VM instance row. This will open an SSH connection directly in your browser.
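Alternatively, if you have the gcloud CLI installed on your own PC, you can open an SSH session from a local terminal (the instance name and zone below are placeholders):
gcloud compute ssh my-ollama-vm --zone=us-central1-a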
Step 3: Update Packages and Install Dependencies
Update the package lists:
sudo apt-get update
Install required dependencies (if not already installed):
sudo apt-get install -y curl git net-tools
- curl: for downloading files.
- git: for version control (may be needed for some installations).
- net-tools: provides the netstat command (or use ss as an alternative).
Step 4: Install Ollama
Download and run the Ollama installation script:
curl -fsSL https://ollama.com/install.sh | sh
- Note: Always verify the official Ollama documentation for the latest installation instructions.
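Once the script finishes, a quick sanity check is to print the installed version:
ollama --version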
Step 5: Run Ollama and Llama2
Start the Ollama server, binding it to all interfaces:
OLLAMA_HOST=0.0.0.0 ollama serve
In a separate terminal window (or use tmux or screen to keep the server running), pull the Llama2 model:
ollama pull llama2
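Note: the install script typically also registers Ollama as a systemd service that listens on 127.0.0.1 only. If you would rather keep that service running instead of starting ollama serve by hand, one option (a sketch, assuming a systemd-based install) is to override the service's environment:
sudo systemctl edit ollama
# In the editor that opens, add these lines, then save and exit:
# [Service]
# Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama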
Step 6: Verify Ollama is Running
Check if Ollama is listening on port 11434:
sudo netstat -tuln | grep 11434
- Confirm that the output shows 0.0.0.0:11434 (on some systems you will see both 0.0.0.0:11434 and [::]:11434).
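You can also confirm that the API answers locally before testing from outside; this endpoint lists the models you have pulled:
curl http://localhost:11434/api/tags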
Step 7: Test from External PC
Open a terminal or command prompt on your external PC.
Use curl to send a request to your VM's external IP (replace your_vm_external_ip with your VM's actual external IP address):
curl -X POST http://your_vm_external_ip:11434/api/generate -H "Content-Type: application/json" -d '{"model": "llama2", "prompt": "Hello from external PC."}'
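By default, /api/generate streams the answer back as a series of JSON objects, one per line. If you prefer a single JSON object, the API accepts "stream": false in the request body:
curl -X POST http://your_vm_external_ip:11434/api/generate -H "Content-Type: application/json" -d '{"model": "llama2", "prompt": "Hello from external PC.", "stream": false}'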
Step 8: Verify the Response
- If successful, you'll see a JSON response with generated text.
- If you encounter errors:
- "Connection refused": Double-check IP, port, firewall, and Ollama status.
- "404 Not Found": Verify the API endpoint and model loading.
- "address already in use": Identify and terminate the conflicting process.
- Check the Ollama logs on the VM, as shown below.
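If Ollama was installed with the official script and runs under systemd, its logs can typically be followed with journalctl:
sudo journalctl -u ollama -f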
Troubleshooting Tips
- VM Restart: If issues persist, try restarting your VM.
- Ollama Reinstall: If all else fails, reinstall Ollama.
- Network Logs: Check the VM's network logs for clues.
- Test Locally: Run the curl command locally on the VM to isolate network issues.
- System Logs: Check the system logs for network-related errors.
- Ollama service: If you start ollama serve manually, make sure the systemd Ollama service is not already running, since it occupies port 11434; stop it first to avoid an "address already in use" error.
- API endpoint: Ensure you are calling the correct API endpoint (e.g., /api/generate for completions).
- Check Ollama's status: sudo systemctl status ollama
- Stop Ollama: sudo systemctl stop ollama
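Putting the last few tips together, a typical recovery sequence for the "address already in use" error looks like this (a sketch; adjust to your setup):
# See which process is holding port 11434 (root is needed to show process names)
sudo ss -tulnp | grep 11434
# If the systemd service holds the port, stop it before serving manually
sudo systemctl stop ollama
OLLAMA_HOST=0.0.0.0 ollama serve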