
Transformers.js has bugs in Next.js (client-side) #1026

Open

LudovicPDG opened this issue Nov 13, 2024 · 0 comments

Labels: bug

System Info

Transformers.js version: "@huggingface/transformers": "^3.0.2"

Next.js version: "next": "14.2.5"

React version: "react": "^18"

OS running Next.js: Windows 11

Environment/Platform

  • [x] Website/web-app
  • [ ] Browser extension
  • [ ] Server-side (e.g., Node.js, Deno, Bun)
  • [ ] Desktop app (e.g., Electron)
  • [ ] Other (e.g., VSCode extension)

Description

I'm very interested in the world of WebML, and to me Transformers.js is one of the most promising libraries in this field, so I'd like to thank all the contributors to this library.

However, I'm currently working on a personal project built with Next.js, and I've run into a lot of issues trying to get Transformers.js to work in client-side components.

First, when I set up the Next.js configuration as the documentation states, i.e.

/** @type {import('next').NextConfig} */
const nextConfig = {
    // (Optional) Export as a static site
    // See https://nextjs.org/docs/pages/building-your-application/deploying/static-exports#configuration
    output: 'export', // Feel free to modify/remove this option

    // Override the default webpack configuration
    webpack: (config) => {
        // See https://webpack.js.org/configuration/resolve/#resolvealias
        config.resolve.alias = {
            ...config.resolve.alias,
            "sharp$": false,
            "onnxruntime-node$": false,
        }
        return config;
    },
}

module.exports = nextConfig

I get this error

Module not found: Can't resolve './'

I also tried this configuration:

const nextConfig = {
  output: "standalone",
  // https://nextjs.org/docs/app/api-reference/next-config-js/serverExternalPackages
  serverExternalPackages: ["@huggingface/transformers"],
};

but it produces the same error as before.

The only configuration that works for me is:

/** @type {import('next').NextConfig} */
const nextConfig = {
  output: 'standalone',
  webpack: (config) => {
    config.resolve.alias = {
      ...config.resolve.alias,
      "@huggingface/transformers": require.resolve("@huggingface/transformers"),
      "sharp$": false,
      "onnxruntime-node$": false,
    };
    return config;
  },
};

module.exports = nextConfig;

But even then, I still have problems when I use the pipeline function to do text generation with onnx-community/Llama-3.2-1B-Instruct, as the documentation describes: https://huggingface.co/blog/llama32#transformersjs

My code:

"use client"

export async function handleFreeMode(){
  const { pipeline } = await import("@huggingface/transformers")
    const generator = await pipeline("text-generation", "onnx-community/Llama-3.2-1B-Instruct");

    // Define the list of messages
    const messages = [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Tell me a joke." },
    ];

    // Generate a response
    const output = await generator(messages, { max_new_tokens: 128 });
    console.log(output[0].generated_text.at(-1).content);
    return output[0].generated_text.at(-1).content;
}

The error:

transformers.cjs:3971 Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'create')
    at createInferenceSession (transformers.cjs:3971:45)
    at eval (transformers.cjs:7090:108)
    at async Promise.all (:3000/fr/148/index 0)
    at async constructSessions (transformers.cjs:7087:31)
    at async Promise.all (:3000/fr/148/index 0)
    at async LlamaForCausalLM.from_pretrained (transformers.cjs:7643:20)
    at async AutoModelForCausalLM.from_pretrained (transformers.cjs:12826:20)
    at async Promise.all (:3000/fr/148/index 1)
    at async loadItems (transformers.cjs:17386:5)
    at async pipeline (transformers.cjs:17316:21)
    at async handleFreeMode (ClientSide.ts:5:23)
    at async handleSubmit (Chatbot.tsx:60:20)
Thank you in advance if anyone in the community can help me solve this problem.
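
In case it helps narrow things down: the failure happens inside createInferenceSession, so my guess is that the ONNX Runtime web backend isn't being resolved correctly in the client bundle. Below is a sketch of a variant I would try; the handleFreeModeWasm name, the explicit device/dtype options, and the env.backends.onnx.wasm.wasmPaths override are my own assumptions based on the v3 docs, not a confirmed fix.

"use client"

// Sketch only: device/dtype and the wasmPaths override are assumptions, not a confirmed fix.
export async function handleFreeModeWasm() {
  const { pipeline, env } = await import("@huggingface/transformers");

  // Load the ONNX Runtime WASM assets from a CDN instead of letting webpack resolve them.
  env.backends.onnx.wasm.wasmPaths = "https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/";

  // Explicitly request the WASM backend and a quantized dtype.
  const generator = await pipeline("text-generation", "onnx-community/Llama-3.2-1B-Instruct", {
    device: "wasm",
    dtype: "q4",
  });

  const messages = [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Tell me a joke." },
  ];

  const output = await generator(messages, { max_new_tokens: 128 });
  return output[0].generated_text.at(-1).content;
}
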

Reproduction

  1. Use Next.js v14.2

  2. Use Transformers.js v3 and call the pipeline function from a client-side component (a minimal sketch follows below)
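
A minimal client component along the lines of my setup; the file path and component name (app/repro/page.tsx, ReproPage) are just illustrative.

"use client"

// Illustrative repro component: clicking the button triggers the error above.
import { useState } from "react";

export default function ReproPage() {
  const [reply, setReply] = useState("");

  async function run() {
    // Same pattern as handleFreeMode above: dynamic import on the client.
    const { pipeline } = await import("@huggingface/transformers");
    const generator = await pipeline("text-generation", "onnx-community/Llama-3.2-1B-Instruct");
    const output = await generator(
      [{ role: "user", content: "Tell me a joke." }],
      { max_new_tokens: 128 },
    );
    setReply(output[0].generated_text.at(-1).content);
  }

  return (
    <div>
      <button onClick={run}>Generate</button>
      <p>{reply}</p>
    </div>
  );
}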

LudovicPDG added the bug label on Nov 13, 2024