
Gorilla is an AI model developed by UC Berkeley and Microsoft Research that can autonomously interact with various online tools by writing API calls.

I’m excited that it’s open source!

Try it yourself at:

🦍 Gorilla: Large Language Model Connected with Massive APIs

Their site is a bit finicky right now (it probably can't handle the traffic), so here's a link to the example notebook. Just open it, save a copy to your personal Drive account, and run all the cells.


I started testing it in Colab by simply replicating one of the example prompts. Here’s how that looked:

prompt = "I want to build a bot that will answer any question asked by users."

print(get_gorilla_response(prompt, model="gorilla-7b-hf-v1"))
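The `get_gorilla_response` helper is defined in the example notebook itself. If you're curious what a function like that boils down to, here's a rough sketch assuming an OpenAI-style chat endpoint — the `ENDPOINT` URL, function body, and `build_request` helper are my own placeholders, not the notebook's actual code:

```python
# Hypothetical reconstruction of a get_gorilla_response-style helper.
# ENDPOINT is a placeholder -- point it at whatever Gorilla server you use.
import json
import urllib.request

ENDPOINT = "http://localhost:8000/v1/chat/completions"  # placeholder URL

def build_request(prompt, model="gorilla-7b-hf-v1"):
    # Gorilla servers speak an OpenAI-style chat protocol, so the payload
    # is just a model name plus a list of chat messages.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def get_gorilla_response(prompt, model="gorilla-7b-hf-v1"):
    payload = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The generated instructions/code come back as the assistant message.
    return body["choices"][0]["message"]["content"]
```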


The result was a list of steps with code:

<<<domain>>>: Natural Language Processing Conversational
<<<api_call>>>: model = AutoModelForSeq2SeqLM.from_pretrained('facebook/blenderbot-1B-distill')
<<<api_provider>>>: Hugging Face Transformers
<<<explanation>>>: 
1. Import required libraries from Hugging Face Transformers.
2. Initialize the tokenizer and model with the 'facebook/blenderbot-1B-distill' checkpoint.
3. Define a function to generate a response given an input, and another function to decode the input and return the generated response as a string.
4. Call the generate_response function with an input message, and pass the response to the corresponding function to decode the output.
<<<code>>>:
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def load_model():
    tokenizer = AutoTokenizer.from_pretrained('facebook/blenderbot-1B-distill')
    model = AutoModelForSeq2SeqLM.from_pretrained('facebook/blenderbot-1B-distill')
    return tokenizer, model

def process_data(input_message, tokenizer, model):
    inputs = tokenizer(input_message, return_tensors="pt")
    outputs = model.generate(inputs["input_ids"])
    response = tokenizer.decode(outputs[0], skip_special_tokens=True)
    return response

input_message = "What are the key differences between renewable and non-renewable energy sources?"

# Load the model and tokenizer
tokenizer, model = load_model()

# Process the data
response = process_data(input_message, tokenizer, model)

print(response)

I copied that code into a new cell and ran it. Sure enough, it worked!

The bot that was generated was no ChatGPT, but I certainly wouldn’t expect that.

The code from my second prompt used up the notebook’s RAM and crashed the session, so I’ll have to try running it locally.
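For anyone wondering why a free Colab session falls over, a quick back-of-envelope estimate helps — model weights alone eat gigabytes before you count activations. The numbers below are rough, and the half-precision loading kwargs in the comment are standard `transformers` options, not something from Gorilla's output:

```python
# Rough estimate of weight memory for an n-parameter model.
# float32 = 4 bytes/param, float16 = 2 bytes/param (weights only --
# activations and framework overhead come on top of this).
def approx_weight_gb(n_params, bytes_per_param):
    return n_params * bytes_per_param / 1e9

fp32_gb = approx_weight_gb(1_000_000_000, 4)  # ~4 GB for a 1B model
fp16_gb = approx_weight_gb(1_000_000_000, 2)  # ~2 GB in half precision

# Loading in half precision is one way to stretch limited RAM, e.g.:
# model = AutoModelForSeq2SeqLM.from_pretrained(
#     'facebook/blenderbot-1B-distill',
#     torch_dtype=torch.float16, low_cpu_mem_usage=True)
```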

So far, Gorilla AI is freaking awesome!

Check it out to see what you can do with it!

I’ll keep exploring and will share anything worthwhile.
