
googleforgames / genai-quickstart


Google for Games Generative AI Quickstart

License: Apache License 2.0

Shell 0.62% Python 83.98% Dockerfile 6.91% HCL 8.49%
games generative-ai google google-cloud

genai-quickstart's People

Contributors

dependabot[bot], ggiovanejr, igooch, mbychkowski, smithpatrick12, thisisnotapril, zaratsian, zmerlynn


genai-quickstart's Issues

Add Message History Parameter into the Vertex Chat API

As it stands right now, we don't have a way of passing the "messages" field with the chat history to the Vertex chat API.

def vertex_llm_chat(payload: Payload_Vertex_Chat):
    try:
        request_payload = {
            'prompt': payload.prompt,
            'context': payload.context,
            'max_output_tokens': payload.max_output_tokens,
            'temperature': payload.temperature,
            'top_p': payload.top_p,
            'top_k': payload.top_k,
        }
        response = model_vertex_llm_chat.call_llm(**request_payload)
        return response.text

The current workaround is to pass the message history through as context.

if 'chatHistory' in request_payload:
    context += f"\nHere is the chat history for {request_payload['characterContext']} that can be used when answering questions:\n"
    seen_chat = []
    for chat in request_payload['chatHistory']:
        if chat not in seen_chat:
            if chat["sender"].upper() in ['USER', request_payload['characterName'].upper()]:
                context += f'{chat["sender"].upper()}: {chat["message"]}\n'
            seen_chat.append(chat)
payload = {
    'prompt': f'''{request_payload["message"]}''',
    'context': context,
}

Based on manual testing I've done with one Vertex Chat API endpoint "chatting" with another Vertex Chat API endpoint, this results in the chat quickly going into a loop.

The most straightforward way to fix this is to add "messages" as an optional field to the Vertex Chat API module, so that the message history does not need to be sent as part of the context. This is in keeping with the Vertex Text Chat schema.
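
A minimal sketch of what that could look like, assuming the payload is a pydantic model and that the Google_Cloud_GenAI wrapper behind call_llm can accept and forward a messages list (the message shape follows the Vertex Text Chat schema; everything else is illustrative, not the module's actual code):

from typing import List, Optional

from pydantic import BaseModel


class ChatMessage(BaseModel):
    # One turn of history, following the Vertex Text Chat schema.
    author: str   # e.g. "user" or the character name
    content: str


class Payload_Vertex_Chat(BaseModel):
    prompt: str
    context: Optional[str] = None
    messages: Optional[List[ChatMessage]] = None  # proposed optional history field
    max_output_tokens: Optional[int] = None
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    top_k: Optional[int] = None


def vertex_llm_chat(payload: Payload_Vertex_Chat):
    request_payload = {
        'prompt': payload.prompt,
        'context': payload.context,
        'max_output_tokens': payload.max_output_tokens,
        'temperature': payload.temperature,
        'top_p': payload.top_p,
        'top_k': payload.top_k,
    }
    if payload.messages:
        # Pass the history separately instead of folding it into context;
        # assumes call_llm can forward a messages list to the Vertex chat model.
        request_payload['messages'] = [m.dict() for m in payload.messages]
    # model_vertex_llm_chat is the module-level wrapper from the snippet above.
    response = model_vertex_llm_chat.call_llm(**request_payload)
    return response.text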

In the future, the API should be updated to be endpoint-agnostic and use the OpenAI chat completion schema.
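
For reference, the OpenAI chat completion schema carries history as a flat list of role/content messages; an endpoint-agnostic version of the same optional field could accept this shape directly (the dialogue below is only a placeholder):

# OpenAI-style chat messages: role is "system", "user", or "assistant".
messages = [
    {"role": "system", "content": "You are the innkeeper NPC."},
    {"role": "user", "content": "Do you have any rooms available?"},
    {"role": "assistant", "content": "Just one, and the last guest never checked out."},
]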

OS Environment Variables and LLM Calls Should Be Mocked for Tests

What happened?

When running pytest -v in a local virtualenv on the test_main.py files:

  • genai_api/src/test_main.py fails because the required OS environment variables are not set.
  • api/vertex_*/src/test_main.py fail because there is no mock for Google_Cloud_GenAI, so the tests try to connect to a real project and call an LLM in that project.
  • api/vertex_*/src/test_main.py do not have the correct expected_response, since the endpoints return a single text string (response.text) rather than a JSON object like {'mocked_key': 'mocked_value'}.

What you expected to happen:

Running the unit tests should not require setting environment variables or calling an external service.
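
A minimal sketch of that isolation, assuming the FastAPI app and the model_vertex_llm_chat wrapper live in main.py as in the snippet above; the environment variable names and the route path are placeholders to be swapped for whatever the app actually reads:

from unittest import mock

import pytest
from fastapi.testclient import TestClient


@pytest.fixture(autouse=True)
def fake_env(monkeypatch):
    # Provide whatever variables main.py reads at import time so the test
    # never needs a real project configuration (names here are placeholders).
    monkeypatch.setenv("GCP_PROJECT_ID", "test-project")
    monkeypatch.setenv("GCP_REGION", "us-central1")


def test_vertex_llm_chat_uses_mocked_llm():
    import main  # imported only after the fake environment is in place

    with mock.patch.object(main, "model_vertex_llm_chat") as fake_model:
        # Fully mock the LLM wrapper so no call leaves the test process.
        fake_model.call_llm.return_value = mock.Mock(text="mocked response")

        client = TestClient(main.app)
        resp = client.post("/", json={"prompt": "test prompt"})  # placeholder route

        assert resp.status_code == 200
        # The endpoint returns response.text, so the expected value is a plain
        # string rather than a {'mocked_key': 'mocked_value'} dict.
        assert resp.json() == "mocked response"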

How to reproduce it (as minimally and precisely as possible):

For the genai_api test:

  1. Create and activate a Python virtual environment.
  2. Run pytest from the genai_api source directory:
    (myvirtualenv) me@me:~/GenAI-quickstart/genai/api/genai_api/src$ pytest -v

For example, for the vertex_chat_api test:

  1. Create and activate a Python virtual environment.
  2. Run pytest from the vertex_chat_api source directory:
    (myvirtualenv) me@me:~/GenAI-quickstart/genai/api/vertex_chat_api/src$ pytest -v
    This fails with a 403 error saying the Vertex AI API has not been used in the project.
  3. Change project_id_response.text to the name of your Google Cloud project in the line:
    project_id = project_id_response.text if project_id_response.status_code == 200 else "Unavailable"
  4. Run pytest -v again from the same directory. Now the test fails with AssertionError: assert 'test response' == {'mocked_key': 'mocked_value'}.

Anything else we need to know?:

The value test response comes from an actual call to the LLM. If you run the GenAI Quickstart cluster, navigate to http://${EXT_IP}/genai_docs (following the instructions in the main readme), and submit the same payload as the test ("prompt": "test prompt"), the response is test response.
