Deploy your service with a cloud provider such as Vercel, Google Cloud Platform (GCP), Amazon Web Services (AWS), or any other provider you are comfortable with. Ensure your service is publicly accessible.
Optional: Expose Your Service During Development Using make-agent (recommended), ngrok, or localtunnel
Instead of deploying your service to a cloud provider, you can use make-agent, ngrok, or localtunnel to expose your local development server to the internet. This is particularly useful for testing and development.
Using make-agent (recommended)
Install make-agent: Install our CLI tool to test your agent live on the Bitte Playground.
pnpm install -D make-agent
Set up a Dynamic Plugin Manifest: Serve your plugin manifest with a dynamic server URL, using the URL from bitte.dev.json (in Next.js, you can use rewrites).
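One way to wire this up is to read the tunnel URL at startup in your Next.js config. This is a minimal sketch; it assumes bitte.dev.json sits in the project root and exposes the tunnel URL under a "url" field (adjust the field name to whatever make-agent actually writes on your machine):

```javascript
// next.config.js — a minimal sketch, not a definitive setup.
// Assumption: bitte.dev.json contains { "url": "https://..." } while
// `make-agent dev` is running; we fall back to localhost otherwise.
const fs = require("fs");

let tunnelUrl = "http://localhost:3000";
try {
  const devConfig = JSON.parse(fs.readFileSync("./bitte.dev.json", "utf8"));
  if (devConfig.url) tunnelUrl = devConfig.url;
} catch {
  // File only exists during a make-agent dev session; keep the fallback.
}

module.exports = {
  env: {
    // Your manifest route can embed this value as the server URL.
    BITTE_SERVER_URL: tunnelUrl,
  },
};
```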
Add dev script:
"dev:make-agent": "next dev & pnpm make-agent dev -p 3000",
Run the dev script: Run pnpm dev:make-agent to live-test your agent on the Bitte Playground.
Using ngrok
Install ngrok: If you haven't installed ngrok yet, you can download it from ngrok's official website.
Expose Your Local Server: Start your local server (e.g., on port 3000), then use ngrok to expose it.
ngrok http 3000
This command will generate a public URL (e.g., https://<random-id>.ngrok.io) that you can use to access your service over the internet.
Update Plugin Manifest: Ensure that the plugin manifest file (ai-plugin.json) points to the ngrok URL:
{
  "schema_version": "v1",
  "name_for_human": "Your Plugin Name",
  "name_for_model": "plugin_name",
  "description_for_human": "A description for your plugin.",
  "description_for_model": "Detailed description for the model.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://<random-id>.ngrok.io/.well-known/ai-plugin.json"
  }
}
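Before pointing an agent at the tunnel, it can help to sanity-check the manifest locally. A minimal sketch in plain Node (no external dependencies); the required-field list mirrors the example manifest above, and the https check catches a URL left pointing at localhost:

```javascript
// validate-manifest.js — checks an ai-plugin.json for the fields used above.
const REQUIRED_FIELDS = [
  "schema_version",
  "name_for_human",
  "name_for_model",
  "description_for_human",
  "description_for_model",
  "auth",
  "api",
];

function validateManifest(manifest) {
  // Collect every missing top-level field.
  const errors = REQUIRED_FIELDS.filter((f) => !(f in manifest)).map(
    (f) => `missing field: ${f}`
  );
  // Tunnel URLs from ngrok/localtunnel are https; localhost is not reachable
  // by the agent, so flag anything that is not an https URL.
  if (manifest.api && manifest.api.url && !manifest.api.url.startsWith("https://")) {
    errors.push("api.url should be an https URL");
  }
  return errors;
}

// Example: a manifest still pointing at localhost fails the https check.
const errors = validateManifest({
  schema_version: "v1",
  name_for_human: "Your Plugin Name",
  name_for_model: "plugin_name",
  description_for_human: "A description for your plugin.",
  description_for_model: "Detailed description for the model.",
  auth: { type: "none" },
  api: { type: "openapi", url: "http://localhost:3000/.well-known/ai-plugin.json" },
});
console.log(errors); // → ["api.url should be an https URL"]
```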
Using localtunnel
Install localtunnel: You can install localtunnel globally using npm:
npm install -g localtunnel
Expose Your Local Server: Start your local server, then use localtunnel to expose it:
lt --port 3000
This command will generate a public URL (e.g., https://<subdomain>.loca.lt) for your service.
Update Plugin Manifest: Make sure the plugin manifest file (ai-plugin.json) uses the localtunnel URL:
{
  "schema_version": "v1",
  "name_for_human": "Your Plugin Name",
  "name_for_model": "plugin_name",
  "description_for_human": "A description for your plugin.",
  "description_for_model": "Detailed description for the model.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://<subdomain>.loca.lt/.well-known/ai-plugin.json"
  }
}