
Comments (11)

lionelrudaz commented on June 12, 2024

Solved the issue by adding "use client" to the component that's rendered in Generative UI and moving

export const dynamic = 'force-dynamic';
export const maxDuration = 60;

to the very top of my page.tsx file.
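
For reference, the arrangement ended up looking roughly like this (the paths and component names below are placeholders, not my actual code):

// app/chat/page.tsx (server component; hypothetical path)
export const dynamic = 'force-dynamic';
export const maxDuration = 60;

import Chat from './Chat';

export default function Page() {
  return <Chat />;
}

// app/chat/Chat.tsx (the component rendered through generative UI)
'use client';

export default function Chat() {
  // client-side rendering of the streamed UI goes here
  return <div>Chat UI</div>;
}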

I hope this helps.


valstu commented on June 12, 2024

Okay, adding this to layout.tsx

export const dynamic = 'force-dynamic'

Seemed to work for me.
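
For context, a minimal sketch of a root layout with that line added (the rest is just the usual boilerplate):

// app/layout.tsx
export const dynamic = 'force-dynamic';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <body>{children}</body>
    </html>
  );
}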


lgrammel commented on June 12, 2024

@threefoldo can you check if this solves the issue for you? https://sdk.vercel.ai/docs/troubleshooting/common-issues/streaming-not-working-in-production


threefoldo commented on June 12, 2024

Thanks for the advice. I added the code at the beginning of the page.tsx file, but it doesn't work.

"use client";

export const dynamic = 'force-dynamic';

The client component:

 // app/your-route/page.js

export const dynamic = 'force-dynamic';

import { someStreamingFunction } from './someStreamingFunction';

async function YourComponent() {
  const data = await someStreamingFunction();

  return (
    <div>
      {/* Your component JSX */}
    </div>
  );
}

export default YourComponent;

The server action:

// app/your-route/someStreamingFunction.js

export async function someStreamingFunction() {
  'use server';

  // Your server-side streaming logic here
  // ...

  return data;
}

Do I need to wrap the server action in a POST API?
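
(To be clear, by that I mean something like the following route handler sketch; the path and shape here are just placeholders, not code I'm currently using:)

// app/api/generate/route.ts (hypothetical alternative)
import { someStreamingFunction } from '../../your-route/someStreamingFunction';

export const dynamic = 'force-dynamic';

export async function POST() {
  // call the server-side streaming logic and return its result as JSON
  const data = await someStreamingFunction();
  return Response.json(data);
}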


lgrammel commented on June 12, 2024

@threefoldo do you have a github repo with a reproduction that you could share?


lionelrudaz commented on June 12, 2024

I'm running into a similar situation, but with streamText rather than streamObject, and when the tool generates the UI.

I have the following error:

7023-4c5b21469a654b50.js:1 TypeError: Cannot destructure property 'children' of 'e' as it is null.
    at l (4168-9753ef37215827ad.js:1:12099)
    at rE (fd9d1056-3a3acf3c4e3d999c.js:1:40344)
    at l$ (fd9d1056-3a3acf3c4e3d999c.js:1:59319)
    at iZ (fd9d1056-3a3acf3c4e3d999c.js:1:117682)
    at ia (fd9d1056-3a3acf3c4e3d999c.js:1:95165)
    at fd9d1056-3a3acf3c4e3d999c.js:1:94987
    at il (fd9d1056-3a3acf3c4e3d999c.js:1:94994)
    at oJ (fd9d1056-3a3acf3c4e3d999c.js:1:92350)
    at oZ (fd9d1056-3a3acf3c4e3d999c.js:1:91769)
    at MessagePort.T (7023-4c5b21469a654b50.js:1:84042)

Here's a video showing the issue. It reproduces in both the Preview and Production environments.

CleanShot.2024-05-09.at.09.55.36.mp4

I've checked, and it looks like the component that's supposed to be injected into is null. It works perfectly on my local machine, though.

I've added this in my page.tsx file:

export const dynamic = 'force-dynamic';
export const maxDuration = 60;

Doesn't work.

I've tried adding <></> as a default value for children; that doesn't work. I've also tried adding a paragraph in addition to my component; that doesn't work either.
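
What I mean by the default-value attempt is roughly this (the component name here is just a placeholder):

function MessageWrapper({ children = <></> }: { children?: React.ReactNode }) {
  return <div>{children}</div>;
}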

I've updated the ai library to 3.1.3 as well.

I have no idea what's wrong, it's really frustrating. Maybe this is connected to your problem, I don't know. If not, then I'll open a separate issue.

My actions.tsx file:

import 'server-only';

import {
  createAI,
  createStreamableUI,
  getMutableAIState,
  getAIState,
  createStreamableValue
} from 'ai/rsc';

import { openai } from '@ai-sdk/openai';
import WINE_MATCHING_PROMPT from 'lib/prompts/wineMatchExplanationAndSearch';
import { nanoid } from '../utils';
import { BotCard, BotMessage, SpinnerMessage, UserMessage } from '@/components/llm/Messages';
import WineSearchResultsCarrousel from '@/components/wine/WineSearchResultsCarrousel';
import { streamText } from 'ai';
import { searchWinesFilterSchema } from '@/interfaces/zodSchemas/searchWinesFilters';
import { searchWines } from 'app/actions/wines';
import { saveMatch } from 'app/actions/matches';

async function submitUserMessage(content: string) {
  'use server';
  //await rateLimit() // Rate limit the function to avoid abuse

  const aiState = getMutableAIState();

  aiState.update({
    ...aiState.get(),
    messages: [
      ...aiState.get().messages,
      {
        id: nanoid(),
        role: 'user',
        content: `${aiState.get().interactions.join('\n\n')}\n\n${content}`
      }
    ]
  })

  const matchId = aiState.get().matchId;

  const history = aiState.get().messages.map(message => ({
    role: message.role,
    content: message.content
  }))

  let textContent = ''
  let searchWinesArgs;
  let foundWines;

  const textStream = createStreamableValue('')
  const spinnerStream = createStreamableUI(<SpinnerMessage />)
  const messageStream = createStreamableUI(null)
  const uiStream = createStreamableUI()

  ;(async () => {
    try {
      const result = await streamText({
        //model: google.generativeAI('models/gemini-1.0-pro-001'),
        model: openai.chat("gpt-4-turbo"),
        temperature: 0,
        tools: {
          searchWines: {
            description: 'Search for wines with color, price range, grapes, country, region and appellation.',
            parameters: searchWinesFilterSchema,
            execute: async (args) => {
              console.log("Execute search wines", args);
              const streamResults = createStreamableValue<string>()
              const wines = await searchWines(args);
              foundWines = wines;
              uiStream.append(<WineSearchResultsCarrousel wines={wines} args={args} matchId={matchId} />)

              streamResults.done(JSON.stringify(wines))

              return wines;
            }
          },
        },
        system: WINE_MATCHING_PROMPT,
        messages: [...history]
      })

      spinnerStream.done(null)

      for await (const delta of result.fullStream) {
        const { type } = delta

        if (type === 'text-delta') {
          const { textDelta } = delta

          textContent += textDelta
          messageStream.update(<BotMessage>{textContent}</BotMessage>)
        } else if (type === 'tool-call') {
          console.log('Tool call:', delta)

          const { toolName, args } = delta

          if (toolName === 'searchWines') {
            console.log("Tool call search for wines", args);
            searchWinesArgs = args;

            uiStream.update(
              <BotCard>
                <WineSearchResultsCarrousel wines={[]} args={args} matchId={matchId} />
              </BotCard>
            )

            console.log('Search wines: completed');
          }
        }
      }

      console.log('AI done, update the message history', textContent, foundWines);
      const newMessages: Message[] = [];
      newMessages.push({
        id: nanoid(),
        role: 'assistant',
        content: textContent
      })

      if (searchWinesArgs) {
        newMessages.push({
          id: nanoid(),
          role: 'assistant',
          content:
            "Voilà le vin qui me paraît parfait pour toi.",
          display: {
            name: 'searchWines',
            props: {
              args: searchWinesArgs,
              wines: foundWines
            }
          }
        })
      }

      aiState.done({
        ...aiState.get(),
        interactions: [],
        messages: [
          ...aiState.get().messages,
          ...newMessages
        ]
      })

      uiStream.done()
      textStream.done()
      messageStream.done()
    } catch (e) {
      console.error(e)

      const error = new Error(
        'The AI got rate limited, please try again later.'
      )
      uiStream.error(error)
      textStream.error(error)
      messageStream.error(error)
      aiState.done({})
    }
  })()

  return {
    id: nanoid(),
    attachments: uiStream.value,
    spinner: spinnerStream.value,
    display: messageStream.value
  }
}

export type Message = {
  role: 'user' | 'assistant' | 'system' | 'function' | 'data' | 'tool'
  content: string
  id?: string
  name?: string
  display?: {
    name: string
    props: Record<string, any>
  }
}

export type AIState = {
  matchId: string
  interactions?: string[]
  messages: Message[]
}

export type UIState = {
  id: string
  display: React.ReactNode
  spinner?: React.ReactNode
  attachments?: React.ReactNode
}[]

export const AI = createAI<AIState, UIState>({
  actions: {
    submitUserMessage,
    // requestCode,
    // validateCode,
    // describeImage
  },
  initialUIState: [],
  initialAIState: { matchId: nanoid(), interactions: [], messages: [] },
  onGetUIState: async () => {
    'use server'

    const aiState = getAIState();

    if (aiState) {
      const uiState = getUIStateFromAIState(aiState)
      return uiState
    }
  },
  onSetAIState: async ({ state, done }) => {
    'use server'
    
    if (done) {
      
      const { matchId, messages } = state;
      const title = messages[0].content
      await saveMatch(matchId, title, messages);
    }
  }
})

export const getUIStateFromAIState = (aiState: AIState) => {
  return aiState.messages
    .filter(message => message.role !== 'system')
    .map((message, index) => ({
      id: `${message.id}`,
      content: message.content,
      display:
        message.role === 'assistant' ? (
          message.display?.name === 'searchWines' ? (
            <BotCard>
              <WineSearchResultsCarrousel wines={message.display.props.wines} args={message.display.props.args} matchId={aiState.matchId} />
            </BotCard>
          ) : message.display?.name === 'showSeatPicker' ? (
            <BotCard>
              <br />
            </BotCard>
          ) : (
            <BotMessage>{message.content}</BotMessage>
          )
        ) : message.role === 'user' ? (
          <UserMessage firstMessage={index===0}>{message.content}</UserMessage>
        ) : (
          <BotMessage>{message.content}</BotMessage>
        )
    }))
}

Let me know if this is helpful.


threefoldo commented on June 12, 2024

I have added the two export directives and updated "ai" to 3.1.3, but streaming still doesn't work. It works perfectly on my local machine, but not on Vercel.

Here is the code:

// page.tsx

"use client";

export const dynamic = 'force-dynamic';
export const maxDuration = 60;

...

const generateContent = async (user_prompt) => {
    console.log('generate lesson plan:\n', user_prompt);
    const { object } = await generate(user_prompt);
    for await (const partialObject of readStreamableValue(object)) {
      if (partialObject) {
        setContent((prev) => ({ ...prev, ...partialObject }));
      }
    }
  };

...

<OutputForm isLoading={isLoading} content={content} setContent={setContent} />

// action.tsx

'use server';

...

export default async function generate(prompt: string) {
  'use server';

  const system_prompt = prompts['system_template'];

  const stream = createStreamableValue();

  (async () => {
    const { partialObjectStream } = await streamObject({
        model: openai('gpt-4-turbo'),
        system: system_prompt,
        prompt: prompt,
        schema: z.object({
            standards: z.string().optional(),
        }),
    });

    for await (const partialObject of partialObjectStream) {
      stream.update(partialObject);
    }

    stream.done();
  })();

  return { object: stream.value };
}


threefoldo commented on June 12, 2024

@threefoldo do you have a github repo with a reproduction that you could share?

@lgrammel I have posted the code below. If that isn't enough, I will try to create a reproduction repo.


threefoldo commented on June 12, 2024

I tried this code and hit the same problem: streaming works in the local environment, but not on Vercel.

Even with this line at the very top of page.tsx:

export const dynamic = 'force-dynamic';

https://sdk.vercel.ai/examples/next-app/basics/streaming-object-generation#server


valstu commented on June 12, 2024

I'm facing the same problem; I was getting a 504 from the server action. I've now added the following lines:

export const dynamic = 'force-dynamic';
export const maxDuration = 300;

I don't get the 504 anymore, but streaming won't work; I just get the final response once the LLM (GPT-4 in this case) is done.

EDIT: Locally, everything works as expected.


threefoldo commented on June 12, 2024

Okay, adding this to layout.tsx

export const dynamic = 'force-dynamic'

Seemed to work for me.

That works for me, too, thank you very much.

