To use this model, you need to have the `@mlc-ai/web-llm` module installed. You can install it with `npm install -S @mlc-ai/web-llm`.
You can see a list of available model records here: https://github.com/mlc-ai/web-llm/blob/main/src/config.ts
Example

```typescript
// Import the chat model and message class.
import { ChatWebLLM } from "@langchain/community/chat_models/webllm";
import { HumanMessage } from "@langchain/core/messages";

// Initialize the ChatWebLLM model with the model record.
const model = new ChatWebLLM({
  model: "Phi2-q4f32_1",
  chatOptions: {
    temperature: 0.5,
  },
});

// Call the model with a message and await the response.
const response = await model.invoke([
  new HumanMessage({ content: "My name is John." }),
]);
```
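Because `ChatWebLLM` implements the standard LangChain chat model interface, you can also stream the response as it is generated. The following is a minimal sketch using the generic `.stream()` method; the prompt text is illustrative.

```typescript
// Continuing from the example above: stream the response chunk by chunk.
// .stream() is the standard LangChain chat model streaming method.
const stream = await model.stream([
  new HumanMessage({ content: "Tell me a short joke." }),
]);

for await (const chunk of stream) {
  // Each chunk is an AIMessageChunk carrying a piece of the generated text.
  console.log(chunk.content);
}
```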