
palm-api's Introduction

PaLM API Banner

Docs | GitHub | FAQ

Features

Highlights

palm-api v1.0 compared to Google's own API:

  • Fast: As fast as the native API (and 4x faster than googlebard)
  • 🪶 Lightweight: 260x smaller minzipped size
  • 🚀 Simple & Easy: 2.8x less code needed

Table of Contents

Why PaLM API?

Google has its own API interface for PaLM, through their @google/generativelanguage and google-auth-library packages.

However, making requests with the native package is simply too complicated, clunky, and slow.

Here's the code you need for the Google API to ask PaLM something with context:

const { DiscussServiceClient } = require("@google-ai/generativelanguage");
const { GoogleAuth } = require("google-auth-library");

const client = new DiscussServiceClient({
	authClient: new GoogleAuth().fromAPIKey(process.env.API_KEY),
});

const result = await client.generateMessage({
	model: "models/chat-bison-001",
	prompt: {
		context: "Respond to all questions with a single number.",
		messages: [{ content: "How tall is the Eiffel Tower?" }],
	},
});

console.log(result[0].candidates[0].content);

And here's the equivalent code in palm-api:

import PaLM from "palm-api";

let bot = new PaLM(process.env.API_KEY);

bot.ask("How tall is the Eiffel Tower?", {
	context: "Respond to all questions with a single number.",
});

Yep, that's it! You get the best features of PaLM in a package that's simpler and easier to use, test, and maintain.

Statistics

Comparing against the Google API...

Size

PaLM API clocks in at just 1.3kb minzipped.

@google/generativelanguage and google-auth-library, the two packages required for Google's own implementation, clock in at a total of roughly 337kb minzipped.

That makes PaLM API around 260 times smaller!

Code Needed

Using the exact example shown above, PaLM API needs around 2.8x less code, measured in characters.

Speed

Comparing the speed of the demo code on Google's own website with equivalent code written in PaLM API, the times are virtually identical.

Tested with hyperfine.

Documentation

Setup

First, install PaLM API on NPM:

npm install palm-api

or PNPM:

pnpm add palm-api

Then, get yourself an API key in MakerSuite here. Click "Create API key in new project," then simply copy the string.

Import PaLM API, then initialize the class with your API key.

Warning

It is recommended that you access your API key from process.env or a .env file.

import PaLM from "palm-api";

let bot = new PaLM(API_KEY, { ...config });
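Following the warning above, here is a minimal sketch of reading the key from the environment instead of hardcoding it. PALM_API_KEY is a hypothetical variable name; use whatever you export.

```javascript
// Read the API key from the environment rather than committing it to code.
// PALM_API_KEY is a hypothetical name; match it to the variable you set.
function getApiKey(env = process.env) {
	const key = env.PALM_API_KEY;
	if (!key) throw new Error("PALM_API_KEY is not set");
	return key;
}

// Then: let bot = new PaLM(getApiKey());
```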

Config

| Config | Type | Description |
| ------ | ---- | ----------- |
| fetch | function | Fetch polyfill with the same interface as native fetch. Optional. |

Note

PaLM itself and all of its methods have a config object that you can pass in as a secondary parameter. Example:

import PaLM from "palm-api";
import fetch from "node-fetch";
let bot = new PaLM(API_KEY, {
	fetch: fetch,
});

PaLM.ask()

Uses generateMessage-capable models to provide a high-quality LLM experience, with context, examples, and more.

Models available: chat-bison-001

Usage:

PaLM.ask(message, { ...config });

Config:

Learn more about model parameters here.

| Config | Type | Description |
| ------ | ---- | ----------- |
| model | string | Any model capable of generateMessage. Default: chat-bison-001. |
| candidate_count | integer | How many responses to generate. Default: 1. |
| temperature | float | Temperature of the model. Default: 0.7. |
| top_p | float | top_p of the model. Default: 0.95. |
| top_k | float | top_k of the model. Default: 40. |
| format | PaLM.FORMATS.MD or PaLM.FORMATS.JSON | Return as JSON or Markdown. Default: PaLM.FORMATS.MD. |
| context | string | Add context to your query. Optional. |
| examples | array of [example_input, example_output] | Show PaLM how to respond. See examples below. Optional. |

Example:

import PaLM from "palm-api";

let bot = new PaLM(API_KEY);

bot.ask("x^2+2x+1", {
	temperature: 0.5,
	candidateCount: 1,
	context: "Simplify the expression",
	examples: [
		["x^2-4", "(x-2)(x+2)"],
		["2x+2", "2(x+1)"],
		// ... etc
	],
});

JSON Response:

[
	{ content: string }, // Your message
	{ content: string }, // AI response
	// And so on in such pairs...
];
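Since the JSON response alternates your messages and the model's replies, a small helper (hypothetical, plain JavaScript) can pull out the newest reply:

```javascript
// The PaLM.ask() JSON format alternates { content } entries: your message,
// then the AI response, and so on. The last element is the newest reply.
function lastReply(conversation) {
	if (conversation.length === 0) return undefined;
	return conversation[conversation.length - 1].content;
}

const sample = [
	{ content: "How tall is the Eiffel Tower?" }, // your message
	{ content: "330" },                           // AI response
];

console.log(lastReply(sample)); // → "330"
```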

PaLM.generateText()

Uses generateText-capable models to let PaLM generate text.

Models available: text-bison-001

API:

PaLM.generateText(message, { ...config });

Config:

Learn more about model parameters here.

| Config | Type | Description |
| ------ | ---- | ----------- |
| model | string | Any model capable of generateText. Default: text-bison-001. |
| candidate_count | integer | How many responses to generate. Default: 1. |
| temperature | float | Temperature of the model. Default: 0. |
| top_p | float | top_p of the model. Default: 0.95. |
| top_k | float | top_k of the model. Default: 40. |
| format | PaLM.FORMATS.MD or PaLM.FORMATS.JSON | Return as JSON or Markdown. Default: PaLM.FORMATS.MD. |

Example:

import PaLM from "palm-api";

let bot = new PaLM(API_KEY);

bot.generateText("Write a poem on puppies.", {
	temperature: 0.5,
	candidateCount: 1,
});

JSON Response:

See more about safety ratings here.

[
	{
		output: output,
		safetyRatings: {
			HARM_CATEGORY_UNSPECIFIED: rating,
			HARM_CATEGORY_DEROGATORY: rating,
			HARM_CATEGORY_TOXICITY: rating,
			HARM_CATEGORY_VIOLENCE: rating,
			HARM_CATEGORY_SEXUAL: rating,
			HARM_CATEGORY_DANGEROUS: rating,
		},
	},
	// More candidates (if asked for)...
];

PaLM.embed()

Uses embedText-capable models to embed your text into an array of floats, which you can use for various complex tasks.

Models available: embedding-gecko-001

Usage:

PaLM.embed(message, { ...config });

Config:

| Config | Type | Description |
| ------ | ---- | ----------- |
| model | string | Any model capable of embedText. Default: embedding-gecko-001. |

Example:

import PaLM from "palm-api";

let bot = new PaLM(API_KEY);

bot.embed("Hello, world!", {
	model: "embedding-gecko-001",
});

JSON Response:

[...embeddingMatrix];
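A typical use for the returned floats is comparing two embeddings by cosine similarity. This is plain math, independent of the palm-api package itself; a sketch:

```javascript
// Cosine similarity between two embedding vectors of equal length:
// dot(a, b) / (|a| * |b|). Values near 1 mean the texts are semantically
// close; values near 0 mean they are unrelated.
function cosineSimilarity(a, b) {
	let dot = 0, normA = 0, normB = 0;
	for (let i = 0; i < a.length; i++) {
		dot += a[i] * b[i];
		normA += a[i] * a[i];
		normB += b[i] * b[i];
	}
	return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical vectors score 1; orthogonal vectors score 0.
console.log(cosineSimilarity([1, 0], [1, 0])); // → 1
console.log(cosineSimilarity([1, 0], [0, 1])); // → 0
```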

PaLM.createChat()

Uses generateMessage-capable models to create a chat interface that's simple, fast, and easy to use.

Usage:

let chat = PaLM.createChat({ ...config });

chat.ask(message, { ...config });

chat.export();

The ask method on Chat remembers previous messages and responses, so you can have a continued conversation.

Basic steps to use import/export chats:

  1. Create an instance of Chat with PaLM.createChat()
  2. Use Chat.ask() to query PaLM
  3. Use Chat.export() to export your messages and PaLM responses
  4. Import your messages with the messages config with PaLM.createChat({messages: exportedMessages})

Info You can actually change the messages exported, and PaLM in your new chat instance will adapt to the "edited history." Use this to your advantage!
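As a sketch of the export, edit, and re-import flow: assuming Chat.export() returns the same alternating message/response pairs as the JSON format of PaLM.ask(), editing the history is plain array manipulation.

```javascript
// Hypothetical exported history, in the alternating { content } pair format:
// your message, then PaLM's reply, and so on.
const exported = [
	{ content: "What is 1+1?" }, // your message
	{ content: "2" },            // PaLM's reply
];

// "Edit history": rewrite PaLM's reply before re-importing. The new chat
// will adapt to this edited conversation.
exported[1] = { content: "Verily, one and one maketh two." };

// Then: let chat = PaLM.createChat({ messages: exported });
```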

Config for createChat():

Learn more about model parameters here. All configuration associated with Chat.ask() except the format is set in the config for createChat().

| Config | Type | Description |
| ------ | ---- | ----------- |
| messages | array | Exported messages from previous Chats. Optional. |
| model | string | Any model capable of generateMessage. Default: chat-bison-001. |
| candidate_count | integer | How many responses to generate. Default: 1. |
| temperature | float | Temperature of the model. Default: 0.7. |
| top_p | float | top_p of the model. Default: 0.95. |
| top_k | float | top_k of the model. Default: 40. |
| context | string | Add context to your query. Optional. |
| examples | array of [example_input, example_output] | Show PaLM how to respond. See examples below. Optional. |

Config for Chat.ask():

| Config | Type | Description |
| ------ | ---- | ----------- |
| format | PaLM.FORMATS.MD or PaLM.FORMATS.JSON | Return as JSON or Markdown. Default: PaLM.FORMATS.MD. |

Example:

import PaLM from "palm-api";

let bot = new PaLM(API_KEY);

let chat = PaLM.createChat({
	temperature: 0,
	context: "Respond like Shakespeare",
});

chat.ask("What is 1+1?");
chat.ask("What do you get if you add 1 to that?");

The response for Chat.ask() is exactly the same as for PaLM.ask(). In fact, they use the same query function under the hood.

Frequently Asked Questions

"Why can the model not access the internet like Bard?"

PaLM is only a Language Model. Google Bard is able to search the Internet because it is performing additional searches on Google, and feeding the results back into PaLM. Thus, if you do want to mimic the web-search behavior, you need to implement it yourself. (Or just use bard-ai for a Google Bard API!)

"fetch is undefined" or "I can't use default fetch"

PaLM API uses the fetch function built into Node.js. It is enabled by default in current versions, but there are still environments where it is undefined or disabled, and you may also need a special fetch for your development environment. For these cases, PaLM API comes with a built-in way to polyfill fetch: just pass your custom fetch function into the config for the PaLM class.

import PaLM from "palm-api";
import fetch from "node-fetch";

let bot = new PaLM(API_KEY, {
	fetch: fetch,
});

It's that easy! Just ensure your polyfill has the exact same API as the default Node.js/browser fetch, or it is not guaranteed to work.

"Cannot require of a ES6 module"

PaLM API, per today's standards, is strictly an ES module (that means it uses the new import/export syntax). Because of this, you have two options if you are still using require/CommonJS modules:

  1. Migrate to ESM yourself! It will be beneficial for you in the future.
  2. Use a dynamic import.
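A sketch of option 2. The helper below is generic, so it can be demonstrated with a built-in module; in your project the specifier would be "palm-api", whose PaLM class is the default export.

```javascript
// From a CommonJS module, an ESM-only package cannot be require()d.
// A dynamic import() works instead: it returns a promise for the module
// namespace, whose .default property holds the default export.
async function loadEsm(specifier) {
	const mod = await import(specifier);
	return mod.default ?? mod;
}

// In your project: const PaLM = await loadEsm("palm-api");
```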

Contributors

A special shoutout to developers and contributors of the bard-ai library. The PaLM API interface is basically an exact port of the bard-ai interface.

Additionally, huge thank-you to @Nyphet for converting the library to TypeScript!

That said, we thank every person who helps in the development of this library, whether in code, ideas, or anything else.


palm-api's Issues

TypeError: Found non-callable @@iterator

Error:
Scenario: using the createChat method as shown in the documentation.

Issue: TypeError: Found non-callable @@iterator

The createChat() method in the PaLM API expects an object as its argument, not an iterable. So there seems to be a mistake in the implementation of the createChat method in this version of the PaLM API: it tries to spread rawConfig as if it were an iterable, but it's actually an object.

A possible way to temporarily fix this: pass an array instead.

No inputs found in config file

Hi, it's me again. I found a problem with my TypeScript port that I couldn't have caught before getting my hands on the real published package.
Luckily the fix should be very easy: we just need to exclude the tsconfig.json file from the published package. There are two ways to do so:

  1. Add a "files" entry in your package.json, like "files": ["out"]
  2. Add a .npmignore file and inside it write tsconfig.json

Personally I'd go with option 1.

Thank you again!

got rejected on ask

Error: Request rejected. Got [object Object] instead of response.
    at Chat.ask (file:///root/hisoka-baileys/node_modules/palm-api/out/index.js:269:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Module.Message (file:///root/hisoka-baileys/event/message.js?v=1697783348701:798:35)
    at async EventEmitter.<anonymous> (file:///root/hisoka-baileys/hisoka.js:250:7)

TypeScript support

I'm trying to use this API in a TypeScript strict project.

I try to import PaLM by doing import {PaLM} from 'palm-api';, however this gives the following error:

Could not find a declaration file for module 'palm-api'. '.../node_modules/palm-api/index.js' implicitly has an 'any' type.
Try npm i --save-dev @types/palm-api if it exists or add a new declaration (.d.ts) file containing declare module 'palm-api'; ts(7016)

The command npm i --save-dev @types/palm-api gives E404.

Would it be possible to add support for TypeScript?
