
Comments (7)

qinxuye commented on August 17, 2024

This is caused by the transformers upgrade. As a workaround, please downgrade transformers:

pip install 'transformers==4.41.2'
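If it helps, here is a minimal sanity check (an illustrative sketch, assuming the model is launched from the same Python environment) to confirm the pin took effect before restarting xinference:

```python
# Hypothetical check: verify the pinned transformers version is the one
# that will actually be imported by the xinference worker.
import transformers

expected = "4.41.2"
assert transformers.__version__ == expected, (
    f"Expected transformers {expected}, found {transformers.__version__}; "
    "run: pip install 'transformers==4.41.2'"
)
print("transformers", transformers.__version__, "OK")
```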


huangl22 commented on August 17, 2024

I ran into this problem as well. Has it been solved?

Jinju-Sun commented on August 17, 2024

I hit the same problem and am also waiting for a solution.

xxch commented on August 17, 2024

I also switched to pip install 'transformers==4.41.2' with inference==0.13.0 and Python 3.11. The error above is gone and chat on the web page works normally, but now I get this error:

--- Logging error ---
Traceback (most recent call last):
  File "/app/miniconda/envs/inference/lib/python3.11/logging/handlers.py", line 73, in emit
    if self.shouldRollover(record):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/miniconda/envs/inference/lib/python3.11/logging/handlers.py", line 196, in shouldRollover
    msg = "%s\n" % self.format(record)
                   ^^^^^^^^^^^^^^^^^^^
  File "/app/miniconda/envs/inference/lib/python3.11/logging/__init__.py", line 953, in format
    return fmt.format(record)
           ^^^^^^^^^^^^^^^^^^
  File "/app/miniconda/envs/inference/lib/python3.11/logging/__init__.py", line 687, in format
    record.message = record.getMessage()
                     ^^^^^^^^^^^^^^^^^^^
  File "/app/miniconda/envs/inference/lib/python3.11/logging/__init__.py", line 377, in getMessage
    msg = msg % self.args
          ~~~~^~~~~~~~~~~
TypeError: not all arguments converted during string formatting
Call stack:
  File "/app/miniconda/envs/inference/lib/python3.11/threading.py", line 1002, in _bootstrap
    self._bootstrap_inner()
  File "/app/miniconda/envs/inference/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
    self.run()
  File "/app/miniconda/envs/inference/lib/python3.11/threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
  File "/app/miniconda/envs/inference/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
    work_item.run()
  File "/app/miniconda/envs/inference/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/app/miniconda/envs/inference/lib/python3.11/site-packages/xoscar/api.py", line 402, in _wrapper
    return next(_gen)
  File "/app/miniconda/envs/inference/lib/python3.11/site-packages/xinference/core/model.py", line 318, in _to_json_generator
    for v in gen:
  File "/app/miniconda/envs/inference/lib/python3.11/site-packages/xinference/model/llm/utils.py", line 558, in _to_chat_completion_chunks
    for i, chunk in enumerate(chunks):
  File "/app/miniconda/envs/inference/lib/python3.11/site-packages/xinference/model/llm/pytorch/chatglm.py", line 259, in _stream_generator
    for chunk_text, _ in self._model.stream_chat(
  File "/app/miniconda/envs/inference/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 35, in generator_context
    response = gen.send(None)
  File "/home/resoft/.cache/huggingface/modules/transformers_modules/glm-4-9b-chat/modeling_chatglm.py", line 1012, in stream_chat
    for outputs in self.stream_generate(**inputs, past_key_values=past_key_values,
  File "/app/miniconda/envs/inference/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 35, in generator_context
    response = gen.send(None)
  File "/home/resoft/.cache/huggingface/modules/transformers_modules/glm-4-9b-chat/modeling_chatglm.py", line 1061, in stream_generate
    logger.warn(
Message: 'Both `max_new_tokens` (=512) and `max_length` (=518) seem to have been set. `max_new_tokens` will take precedence. Please refer to the documentation for more information. (https://huggingface.co/docs/transformers/main/en/main_classes/text_generation)'
Arguments: (<class 'UserWarning'>,)

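For what it's worth, this "Logging error" block is noise rather than a real failure: the call stack ends in modeling_chatglm.py (line 1061) calling logger.warn(...) with UserWarning as an extra argument, and Python's logging then tries to %-format a message that has no placeholder. A minimal sketch that reproduces the same traceback (the logger name here is arbitrary, purely for illustration):

```python
import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger("demo")  # arbitrary name, illustration only

# Same shape as the call in modeling_chatglm.py's stream_generate():
# the message contains no %-placeholder, yet UserWarning is passed as a
# formatting argument, so record.getMessage() raises
# "TypeError: not all arguments converted during string formatting"
# and logging prints a "--- Logging error ---" block instead of the warning.
logger.warning(
    "Both `max_new_tokens` (=512) and `max_length` (=518) seem to have been set.",
    UserWarning,
)
```

The underlying warning about max_new_tokens vs. max_length is itself harmless, which matches the observation that chat still works normally.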

xxch commented on August 17, 2024

glm4 needs a relatively old transformers version, and other models each need different versions. What can we do about this?

firrice commented on August 17, 2024

(quoting xxch's comment and traceback above)

I get the same error: TypeError: not all arguments converted during string formatting

koko426 commented on August 17, 2024

Same error here with 'transformers==4.41.2': TypeError: not all arguments converted during string formatting

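Until this is fixed upstream, one way to quiet the noisy traceback is to drop log records whose %-arguments do not fit their message. This is only a hedged sketch: it assumes you can run code inside the xinference worker process and that the rotating file handler seen in the traceback hangs off the root logger.

```python
import logging

class DropMisformattedRecords(logging.Filter):
    """Drop records whose %-format args don't match the message (illustrative workaround)."""

    def filter(self, record: logging.LogRecord) -> bool:
        try:
            record.getMessage()  # raises TypeError when args don't fit the message
            return True
        except TypeError:
            return False

# Assumption: the handler emitting the "--- Logging error ---" block is
# attached to the root logger; give each such handler the filter so bad
# records are skipped before formatting.
for handler in logging.getLogger().handlers:
    handler.addFilter(DropMisformattedRecords())
```

This does not change model behaviour; it only suppresses the formatting error triggered by the logger.warn(..., UserWarning) call.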
