Home-llm model returns the response every time differently


I am trying to integrate the home-llm model with my Node.js application for testing purposes. Below is the system prompt example I am following from the home-llm GitHub README, but with my own device configuration substituted into the system prompt:

https://github.com/acon96/home-llm/blob/develop/README.md

    You are 'Al', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.
    Services: light.turn_off(), light.turn_on(brightness,rgb_color)
    Devices:
    light.a2eb-11ed-bef4-c7c47e09aabb 'Lamp 1' = on;80%
    light.be6f-11ed-a65f-f5afc3dc38ee 'Lamp 2' = on;80%
    light.ddd0-11ed-928c-a95e465b84db 'Lamp 3' = on;80%
    light.dec2-11ed-928c-a95e465b84db 'Lamp 4' = on;80%

Below is my TypeScript code:

    import { Ollama } from 'ollama';

    class OllamaService {
      constructor(private readonly userService: UserService) {}

      public async responseFromOllama(data: any) {
        const { userId, message } = data;
        const systemPromptName = data.systemPromptName || 'Al';

        const user: User | undefined = await this.userService.findOne(userId);
        const systemPrompt = this.generateSystemPrompt(user?.devices, systemPromptName);

        console.log(systemPrompt);
        const ollama = new Ollama();

        const response = await ollama.generate({
          model: 'homellm:latest',
          prompt: message,
          format: 'json',
          stream: false,
          system: systemPrompt,
        });
        return response;
      }

      public generateSystemPrompt(devices: any, systemPromptName: string): string {
        let systemPrompt = `You are '${systemPromptName}', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.\n`;
        systemPrompt += `Services: light.turn_off(), light.turn_on(brightness,rgb_color)\n`;
        systemPrompt += `Devices:\n`;
        devices.forEach((device: any) => {
          systemPrompt += `light.${device.id} '${device.name}' = on;80%\n`;
        });

        return systemPrompt;
      }
    }
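One detail I noticed while debugging: the `generate` call above does not pin any sampling options, so each call samples tokens randomly. A sketch of the same request payload with sampling constrained (the `temperature` and `seed` fields under `options` are assumptions based on the Ollama REST API parameters, not something I have confirmed for this model):

```typescript
// Sketch: the generate() payload with sampling pinned down so that
// repeated calls on the same prompt are more reproducible.
// `options.temperature` and `options.seed` are assumed to be forwarded
// to the Ollama server as in its REST API.
const deterministicRequest = {
  model: 'homellm:latest',
  stream: false,
  options: {
    temperature: 0, // remove randomness from token sampling
    seed: 42,       // fixed seed so identical prompts repeat
  },
};
```

This is only a configuration fragment; the rest of the `generate` arguments (`prompt`, `system`) would be spread in as in the code above.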

I send this request to my endpoint:

    {
        "userId": "abc123",
        "message": "Turn on Lamp 2"
    }

I would expect a response like the one shown in the home-llm README (https://github.com/acon96/home-llm/blob/develop/README.md):

    turning on the Lamp 2 for you now
    ```homeassistant
    { "service": "light.turn_on", "target_device": "light.be6f-11ed-a65f-f5afc3dc38ee" }
    ```
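When the model does follow that format, I extract the service call from the reply roughly like this (a sketch, assuming the response text contains at most one ```homeassistant fenced block; the `ServiceCall` shape is my own name for the JSON the README shows):

```typescript
// Sketch: pull the JSON service call out of a ```homeassistant fenced
// block in the model's reply. Returns null if no block is found or the
// block's contents are not valid JSON.
interface ServiceCall {
  service: string;
  target_device: string;
}

function parseHomeAssistantBlock(responseText: string): ServiceCall | null {
  const match = responseText.match(/```homeassistant\s*([\s\S]*?)```/);
  if (!match) return null;
  try {
    return JSON.parse(match[1]) as ServiceCall;
  } catch {
    return null; // block present but not valid JSON
  }
}
```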

But I keep getting a different response every time, and almost never in the format above; the success rate is under 10%. Could this be caused by the home-llm model build I downloaded from Hugging Face?
