QuickStart Guide

ChainGPT Web3 AI Chatbot & LLM – QuickStart Guide (Tech‑Team‑Verified)

This short guide shows you how to call ChainGPT’s Web3 LLM in under five minutes—either with plain HTTPS or the official JavaScript SDK. Everything here mirrors the internal Tech‑Team reference, not the older public docs.


1. Prerequisites ✔︎

  • ChainGPT account + API key. Create a key in AI Hub → API Dashboard and copy it once.

  • Credits. Each request costs 0.5 credits; turning chatHistory on adds +0.5 credits per call.

  • HTTP client. cURL, Postman, fetch, axios, etc.

  • Node ≥ 14 (if using the SDK). Install from npm.

  • Secure key storage. Put the key in an env var, secret manager, or server config—never ship it to the browser.

Tip: export CHAINGPT_API_KEY="sk‑***" so the examples just work.


2. Integration Options

  • REST API. Use it from any language or server. POST https://api.chaingpt.org/chat/stream (a single endpoint for both blob and streaming responses).

  • JavaScript SDK. Use it from Node or a build-step web app. npm install @chaingpt/generalchat → call createChatBlob() or createChatStream().


3. QuickStart via REST API

3.1 Authentication & Endpoint

There is no separate /chat endpoint—all chat traffic goes to /chat/stream.
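A quick way to sanity-check your key before sending a real prompt (a sketch that assumes a standard bearer-token Authorization header; confirm the exact format in the API Reference). Per the error-code table below, a 401 points to a missing or bad key, while a 400 usually means the key was accepted but the empty body was rejected:

```bash
# Send an intentionally empty body and print only the HTTP status code.
# Uses the CHAINGPT_API_KEY exported in the Prerequisites tip.
curl -s -o /dev/null -w "%{http_code}\n" \
  -X POST https://api.chaingpt.org/chat/stream \
  -H "Authorization: Bearer $CHAINGPT_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{}'
```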

3.2 Single‑Shot (JSON “blob”) response

If you don’t treat the response as a stream, cURL (or your HTTP client) simply waits until the LLM finishes and then returns a single JSON object.
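For example (a sketch; the model ID shown is an assumption, so take the current IDs from the API Reference or your API Dashboard):

```bash
# Single-shot ("blob") request: the connection stays open until the full answer is ready.
curl -X POST https://api.chaingpt.org/chat/stream \
  -H "Authorization: Bearer $CHAINGPT_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "general_assistant",
    "question": "What is ChainGPT?",
    "chatHistory": "off"
  }'
```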

3.3 Streaming in real time
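The same request consumed as a stream (the model-ID caveat from 3.2 still applies):

```bash
# -N disables cURL's output buffering so chunks print as soon as they arrive.
curl -N -X POST https://api.chaingpt.org/chat/stream \
  -H "Authorization: Bearer $CHAINGPT_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "general_assistant",
    "question": "Explain gas fees in one short paragraph.",
    "chatHistory": "off"
  }'
```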

-N turns off buffering so you see partial chunks instantly. Concatenate chunks on arrival until the connection closes.

3.4 Conversation Memory (chat history)

Subsequent calls with the same sdkUniqueId and chatHistory:"on" include the prior Q&A so the bot answers in context.
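A sketch of a follow-up call; the sdkUniqueId value is just an example, and the model-ID caveat from 3.2 still applies:

```bash
# Reusing the same sdkUniqueId with chatHistory "on" makes the API send the
# prior Q&A along with this question.
curl -X POST https://api.chaingpt.org/chat/stream \
  -H "Authorization: Bearer $CHAINGPT_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "general_assistant",
    "question": "And how do I revoke it?",
    "chatHistory": "on",
    "sdkUniqueId": "user-1234-session-1"
  }'
```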

Cost impact: chatHistory:"on" consumes +0.5 credits per request.

3.5 Custom Context & Tone

Set useCustomContext:true; include only the fields you need. Omit contextInjection to fall back to the default context configured in AI Hub.
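A sketch of a context-injected request; the contextInjection sub-fields shown here are purely illustrative, so take the exact field names from the API Reference:

```bash
# useCustomContext:true tells the API to use the context supplied in this request.
# The contextInjection sub-fields are illustrative only; see the API Reference
# for the exact schema. Omit contextInjection to fall back to the AI Hub defaults.
curl -X POST https://api.chaingpt.org/chat/stream \
  -H "Authorization: Bearer $CHAINGPT_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "general_assistant",
    "question": "What does our token do?",
    "chatHistory": "off",
    "useCustomContext": true,
    "contextInjection": {
      "companyName": "Example Labs",
      "companyDescription": "A DeFi analytics startup",
      "aiTone": "FRIENDLY"
    }
  }'
```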


4. QuickStart via JavaScript SDK

4.1 Install & init
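A minimal sketch; if the constructor options differ in your SDK version, follow the package README:

```js
// npm install @chaingpt/generalchat

const { GeneralChat } = require('@chaingpt/generalchat');

// Read the key from the environment (see the Prerequisites tip);
// never hard-code it in anything that ships to the browser.
const generalchat = new GeneralChat({
  apiKey: process.env.CHAINGPT_API_KEY,
});
```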

4.2 Blob response

createChatBlob() buffers the /chat/stream response for you.
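For example, reusing the generalchat client from 4.1 (a sketch; log the full response object to see exactly where the answer text lives):

```js
async function askOnce() {
  const response = await generalchat.createChatBlob({
    question: 'What is ChainGPT?',
    chatHistory: 'off',
  });
  console.log(response); // inspect the object to find the answer field
}

askOnce().catch(console.error);
```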

4.3 Streaming response
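createChatStream() hands you the response as it is generated. A sketch that assumes a Node-style readable stream; wire up 'data', 'end', and 'error' as recommended in the Best Practices below:

```js
async function askStreaming() {
  const stream = await generalchat.createChatStream({
    question: 'Explain gas fees in one short paragraph.',
    chatHistory: 'off',
  });

  stream.on('data', (chunk) => process.stdout.write(chunk.toString()));
  stream.on('end', () => console.log('\n[stream closed]'));
  stream.on('error', (err) => console.error('stream error:', err));
}

askStreaming().catch(console.error);
```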

4.4 Chat history & context

SDK parameters match the REST fields 1‑for‑1.
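For instance, with the same caveats as the REST examples (the sdkUniqueId is just an example and the contextInjection sub-fields are illustrative):

```js
async function askWithHistoryAndContext() {
  const response = await generalchat.createChatBlob({
    question: 'And how do I revoke it?',
    chatHistory: 'on',                  // +0.5 credits per call
    sdkUniqueId: 'user-1234-session-1', // one id per user/session
    useCustomContext: true,
    contextInjection: {
      // Illustrative field names; take the exact schema from the API Reference.
      companyName: 'Example Labs',
      companyDescription: 'A DeFi analytics startup',
    },
  });
  console.log(response);
}
```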


5. Error Codes — Quick Reference

  • 401: missing or bad API key. Check the Authorization header.

  • 402 / 403: out of credits. Top up in AI Hub.

  • 400: bad JSON or a missing field. Verify model and question.

  • 5xx: server error. Retry after a brief delay.
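One way to apply this table in code: treat 4xx responses as configuration problems to fix, and retry only 5xx-style failures. A sketch; where the status code lives on the error object depends on your HTTP client or SDK version:

```js
// Retry a call a few times, but only for server-side (5xx) failures.
async function withRetry(fn, attempts = 3) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      const status = err.status ?? err.response?.status; // adjust to your client
      const isServerError = status >= 500 && status < 600;
      if (!isServerError || i === attempts - 1) throw err; // 4xx: fix the request instead
      await new Promise((r) => setTimeout(r, 500 * (i + 1))); // brief, growing delay
    }
  }
}

// Usage: withRetry(() => generalchat.createChatBlob({ question: 'Hi', chatHistory: 'off' }));
```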


6. Best Practices

  • Hide the key. Never embed it in client JavaScript—proxy via your backend.

  • Watch credits. One call = 0.5 credits; history = +0.5. Set alerts.

  • Use unique sdkUniqueId. One per user/session keeps histories clean.

  • Handle stream ‘end’ & ‘error’. Always close or retry gracefully.

  • Stay updated. Check the npm changelog for new SDK features or model IDs.


You’re now ready to integrate ChainGPT using the exact endpoints and parameters defined by the Tech Team. For deeper dives, see the full API Reference or SDK docs. Happy building!
