Konko

This page covers how to run models on Konko within LangChain.

Konko API is a fully managed API designed to help application developers:

  • Select the right LLM(s) for their application
  • Prototype with various open-source and proprietary LLMs
  • Move to production in line with their security, privacy, throughput, and latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant infrastructure

Installation and Setup

First you'll need an API key

You can request it by messaging support@konko.ai

Install Konko AI's Python SDK

1. Enable a Python 3.8+ environment
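
For example, you can create an isolated environment with Python's built-in venv module (the environment name konko-env below is just an illustration):

python3 -m venv konko-env          # create a virtual environment
source konko-env/bin/activate      # activate it (on Windows: konko-env\Scripts\activate)
python --version                   # confirm the interpreter is 3.8 or newer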

2. Set API Keys

Option 1: Set Environment Variables
  1. You can set environment variables for:

    1. KONKO_API_KEY (Required)
    2. OPENAI_API_KEY (Optional)
  2. In your current shell session, use the export command:

export KONKO_API_KEY={your_KONKO_API_KEY_here}
export OPENAI_API_KEY={your_OPENAI_API_KEY_here} #Optional

Alternatively, you can add the above lines directly to your shell startup script (such as .bashrc or .bash_profile for Bash, or .zshrc for Zsh) so they are set automatically every time a new shell session starts.
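
As a quick sanity check, you can confirm the variables are visible from Python before moving on (a minimal sketch; os.getenv simply returns None for anything unset):

import os

# KONKO_API_KEY is required; OPENAI_API_KEY is optional
assert os.getenv("KONKO_API_KEY"), "KONKO_API_KEY is not set"
print("OPENAI_API_KEY set:", os.getenv("OPENAI_API_KEY") is not None)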

Option 2: Set API Keys Programmatically

If you prefer to set your API keys directly within your Python script or Jupyter notebook, you can use the following commands:

import konko

konko.set_api_key('your_KONKO_API_KEY_here')
konko.set_openai_api_key('your_OPENAI_API_KEY_here')  # Optional

3. Install the SDK

pip install konko

4. Verify Installation & Authentication

# Confirm konko has installed successfully
import konko
# Confirm API keys from Konko and OpenAI are set properly
konko.Model.list()
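
If either key is missing or invalid, the call above will raise an error. A minimal sketch of a friendlier check (the exact exception type raised by the SDK is not assumed here, so a broad except is used purely for illustration):

import konko

try:
    konko.Model.list()
    print("konko is installed and the API keys are accepted.")
except Exception as err:  # broad catch for illustration only
    print(f"Verification failed: {err}")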

Calling a model

Find a model on the Konko Introduction page.

Another way to find the list of models running on the Konko instance is through the models endpoint.
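
For example, the same konko.Model.list() call from step 4 can be used to print the available model identifiers. This is a minimal sketch: the response layout (a data list whose entries carry an id field) is an assumption based on OpenAI-style APIs, so adjust it to whatever the call actually returns:

import konko

models = konko.Model.list()
# Assumed OpenAI-style layout: a "data" list of entries with an "id" field
for entry in models["data"]:
    print(entry["id"])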

Examples of Endpoint Usage

  • ChatCompletion with Mistral-7B:

    from langchain.chat_models import ChatKonko
    from langchain.schema import HumanMessage

    chat_instance = ChatKonko(max_tokens=10, model='mistralai/mistral-7b-instruct-v0.1')
    msg = HumanMessage(content="Hi")
    chat_response = chat_instance([msg])

  • Completion with mistralai/Mistral-7B-v0.1:

    from langchain.llms import Konko
    llm = Konko(max_tokens=800, model='mistralai/Mistral-7B-v0.1')
    prompt = "Generate a Product Description for Apple Iphone 15"
    response = llm(prompt)
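
Both calls return results you can inspect directly; with the interfaces shown above, the chat model returns a message object and the LLM returns a plain string:

print(chat_response.content)  # the assistant's reply to "Hi"
print(response)               # the generated product description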

For further assistance, contact support@konko.ai or join our Discord.