I'm trying to run this app: GitHub - togethercomputer/llamaindex-chatbot: A RAG Chatbot with Next.js, Together.ai and Llama Index
But I'm getting this error, [Bug] Problem working with Next.js · Issue #210 · huggingface/transformers.js · GitHub, and am unable to fix it.
It's for a job assignment.
Can someone please help me?
The solution is provided in the issue you linked, but you haven't pointed out what kind of bug you encountered.
You need to share the error output (or a screenshot) of the bug you ran into.
Module parse failed: Unexpected character '�' (1:0)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
(Source code omitted for this binary file)

Import trace for requested module:
./node_modules/onnxruntime-node/bin/napi-v3/darwin/arm64/onnxruntime_binding.node
./node_modules/onnxruntime-node/bin/napi-v3/ sync ^\.\/.*\/onnxruntime_binding\.node$
./node_modules/onnxruntime-node/dist/binding.js
./node_modules/onnxruntime-node/dist/backend.js
./node_modules/onnxruntime-node/dist/index.js
./node_modules/@xenova/transformers/src/backends/onnx.js
./node_modules/@xenova/transformers/src/env.js
./node_modules/@xenova/transformers/src/transformers.js
./node_modules/llamaindex/dist/index.mjs
./app/api/chat/route.ts
POST /api/chat 500 in 94176ms
GET / 500 in 72ms
It's the same error as in the GitHub issue. I have tried the solutions there but still get the same error.
It occurs on a POST request to /api/chat.
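The suggestions there boil down to keeping the native onnxruntime-node binary out of the webpack bundle via next.config.js. What I tried was roughly the following (a sketch, assuming Next.js 13/14; in Next.js 15 the option was renamed to serverExternalPackages):

// next.config.js: treat onnxruntime-node as a server-only external package,
// so its .node binary is loaded at runtime instead of being parsed by webpack
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    serverComponentsExternalPackages: ['onnxruntime-node'],
  },
};

module.exports = nextConfig;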
An api/chat 500 error would be an Internal Server Error, right?
Did you raise this issue with LlamaIndex?
/api/chat is a backend endpoint of the Next.js application (GitHub - togethercomputer/llamaindex-chatbot: A RAG Chatbot with Next.js, Together.ai and Llama Index) which calls LlamaIndex functions to process an uploaded PDF and perform Q&A over it.
Did you try Xenova's comment pointing to the server-side and client-side documentation?
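If the experimental flag alone doesn't cover this route, another workaround people use for native .node addons is to mark the package as a webpack external so it is resolved with require() at runtime rather than bundled. A minimal sketch (merge it with whatever the repo's next.config.js already has):

// next.config.js (fallback): exclude the native binding from the server bundle
/** @type {import('next').NextConfig} */
const nextConfig = {
  webpack: (config, { isServer }) => {
    if (isServer) {
      // Don't let webpack try to parse the binary; load it at runtime instead
      config.externals.push({ 'onnxruntime-node': 'commonjs onnxruntime-node' });
    }
    return config;
  },
};

module.exports = nextConfig;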
An error 500 from the api/chat endpoint usually tells you to check the API documentation, check your database, and check your request: make sure there are no errors in the configuration of the request, such as typos, stray whitespace, or invalid JSON formatting. You can also try a different browser or network, and reset file and folder permissions.
Also, since you are using Together.ai and LlamaIndex, this could happen because you are using the API endpoint from the server side instead of the client side, or vice versa, which would also throw this api/chat error.
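On the request side, a well-formed client-side call to the route looks something like this (a sketch; the body field names are assumptions, so match them to whatever app/api/chat/route.ts actually expects):

// Client-side helper calling the Next.js API route (field names are illustrative)
async function askChatbot(question: string): Promise<string> {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // Hypothetical body shape; check the route handler for the real schema
    body: JSON.stringify({ messages: [{ role: 'user', content: question }] }),
  });
  if (!res.ok) {
    // A 500 here means the handler (or its bundling) failed on the server
    throw new Error(`POST /api/chat failed with status ${res.status}`);
  }
  return res.text();
}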