# CodinIT.dev: Build With AI in a Local Environment or With Our Web App
Documentation • Website • Desktop App Docs • Features • Get Started
This is a pnpm workspace monorepo containing two applications:
| Application | Location | Framework | Port | Deploy |
|---|---|---|---|---|
| @codinit/web | `/` (root) | Next.js 14 | 3000 | Vercel |
| @codinit/desktop | `/apps/desktop` | Remix + Electron | 5173 | Desktop installers |
See WORKSPACE.md for the complete workspace guide (commands, deployment, architecture).
## Features

- AI-Powered Code Generation - Multiple LLM providers (OpenAI, Anthropic, Google AI, and more)
- Real-time Code Execution - Secure E2B sandboxes with live preview
- Multiple Development Environments - Python, Next.js, Vue.js, Streamlit, Gradio
- Streaming AI Responses - Real-time UI updates with Vercel AI SDK
- Secure Authentication - Supabase auth with Row Level Security
- Package Installation - Install any npm or pip package on the fly
## Supported LLM Providers

- OpenAI (GPT-5, GPT-4)
- Anthropic (Claude models)
- Google AI (Gemini)
- Groq (Fast inference)
- Fireworks AI
- Together AI
- Mistral AI
- xAI (Grok)
- DeepSeek
- Ollama (Local models)
## Development Environments

- Python Data Analyst - Jupyter-style execution with data visualization
- Next.js Developer - Full-stack React applications
- Vue.js Developer - Vue 3 applications
- Streamlit Developer - Data apps and dashboards
- Gradio Developer - ML model interfaces
## Tech Stack

- Next.js 14 (App Router, Server Actions)
- shadcn/ui + TailwindCSS for beautiful UI
- Vercel AI SDK for LLM streaming
- E2B for secure code execution
- Supabase for database and auth
- TypeScript for type safety
Give us a star if you like this project!
## Prerequisites

- Git
- A recent version of Node.js and the pnpm package manager
- E2B API key
- LLM provider API key
## Get Started

In your terminal, clone the repository:

```bash
git clone https://github.com/Gerome-Elassaad/CodingIT.git
```

Enter the repository:

```bash
cd CodingIT
```

Run the following to install the required dependencies for both workspaces:

```bash
pnpm install
```
Note: This project uses pnpm workspaces. The command above installs dependencies for both the web app and desktop app.
Create a `.env.local` file and set the following:

```sh
# Get your API key here - https://e2b.dev/
E2B_API_KEY="your-e2b-api-key"

# OpenAI API key
OPENAI_API_KEY=

# Other providers
ANTHROPIC_API_KEY=
GROQ_API_KEY=
FIREWORKS_API_KEY=
TOGETHER_API_KEY=
GOOGLE_AI_API_KEY=
GOOGLE_VERTEX_CREDENTIALS=
MISTRAL_API_KEY=
XAI_API_KEY=
```
### Optional env vars

```sh
# Domain of the site
NEXT_PUBLIC_SITE_URL=

# Rate limiting
RATE_LIMIT_MAX_REQUESTS=
RATE_LIMIT_WINDOW=

# Vercel/Upstash KV (short URLs, rate limiting)
KV_REST_API_URL=
KV_REST_API_TOKEN=

# Supabase (auth)
SUPABASE_URL=
SUPABASE_ANON_KEY=

# PostHog (analytics)
NEXT_PUBLIC_POSTHOG_KEY=
NEXT_PUBLIC_POSTHOG_HOST=
```
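To make the rate-limit variables concrete, here is a minimal in-memory fixed-window limiter showing what `RATE_LIMIT_MAX_REQUESTS` and `RATE_LIMIT_WINDOW` typically control. This is an illustrative sketch only; the app itself rate-limits via Vercel/Upstash KV, and the class and method names below are invented for the example.

```typescript
// Illustrative only: NOT the project's implementation (which uses Upstash KV).
// A client key gets at most `maxRequests` calls per `windowMs` window.
class FixedWindowLimiter {
  private windows = new Map<string, { count: number; resetAt: number }>();
  private maxRequests: number; // cf. RATE_LIMIT_MAX_REQUESTS
  private windowMs: number;    // cf. RATE_LIMIT_WINDOW (as milliseconds here)

  constructor(maxRequests: number, windowMs: number) {
    this.maxRequests = maxRequests;
    this.windowMs = windowMs;
  }

  // Returns true if the request is allowed; `now` is injectable for testing.
  allow(key: string, now: number = Date.now()): boolean {
    const w = this.windows.get(key);
    if (!w || now >= w.resetAt) {
      // First request in a fresh window: start counting again.
      this.windows.set(key, { count: 1, resetAt: now + this.windowMs });
      return true;
    }
    if (w.count < this.maxRequests) {
      w.count += 1;
      return true;
    }
    return false; // quota exhausted until resetAt
  }
}
```

A production setup would store the counters in shared KV storage (as the `KV_REST_API_*` variables suggest) so limits hold across serverless instances.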
### Disabling functionality (when uncommented)

```sh
# Disable API key and base URL input in the chat
# NEXT_PUBLIC_NO_API_KEY_INPUT=
# NEXT_PUBLIC_NO_BASE_URL_INPUT=

# Hide local models from the list of available models
# NEXT_PUBLIC_HIDE_LOCAL_MODELS=
```

## Run Locally

Web App (Next.js):

```bash
pnpm dev
```

Visit http://localhost:3000
Desktop App (Electron + Remix):

```bash
pnpm desktop:dev
```

Or:

```bash
cd apps/desktop && pnpm dev
```

Visit http://localhost:5173
## Build

Web App:

```bash
pnpm build
```

Desktop App:

```bash
pnpm desktop:build        # Build all platforms
pnpm desktop:build:mac    # macOS only
pnpm desktop:build:win    # Windows only
pnpm desktop:build:linux  # Linux only
```
## Contributing

As an open-source project, we welcome contributions from the community. If you are experiencing any bugs or want to add improvements, please feel free to open an issue or pull request.
### Adding Custom Sandbox Templates

1. Make sure the E2B CLI is installed and you're logged in.

2. Add a new folder under `sandbox-templates/`.

3. Initialize a new template using the E2B CLI:

   ```bash
   e2b template init
   ```

   This will create a new file called `e2b.Dockerfile`.

4. Configure the Dockerfile. Example Streamlit template:

   ```dockerfile
   # Use Debian-based base image
   FROM python:3.11-slim

   # Install dependencies
   RUN pip3 install --no-cache-dir streamlit pandas numpy matplotlib requests seaborn plotly

   # Set working directory
   WORKDIR /home/user
   COPY . /home/user
   ```

5. Set the start command in `e2b.toml`:

   ```toml
   start_cmd = "cd /home/user && streamlit run app.py --server.port 8501 --server.address 0.0.0.0"
   ```

6. Deploy the template:

   ```bash
   e2b template build --name <template-name>
   ```

   Success message:

   ```
   Building sandbox template <template-id> <template-name> finished.
   ```

7. Register the template in `lib/templates.json`:

   ```json
   "custom-template": {
     "name": "Custom Template",
     "lib": ["dependency1", "dependency2"],
     "file": "main.py",
     "instructions": "Template-specific instructions for the AI.",
     "port": 8080
   }
   ```

8. Add a template logo (optional): place an SVG logo in `public/thirdparty/templates/`.
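A new `lib/templates.json` entry is easy to get subtly wrong (a missing field or a string port), so a small shape check can catch mistakes before runtime. The field names below come from the example entry above; the `TemplateConfig` interface and `isValidTemplate` helper are illustrative, not part of the repository.

```typescript
// Illustrative validator for the shape of a lib/templates.json entry.
// Field names mirror the "custom-template" example; the helper itself
// is an assumption, not project code.
interface TemplateConfig {
  name: string;         // display name of the template
  lib: string[];        // packages preinstalled in the sandbox image
  file: string;         // entry file the generated code is written to
  instructions: string; // template-specific guidance for the AI
  port: number;         // port the sandbox dev server listens on
}

function isValidTemplate(value: unknown): value is TemplateConfig {
  if (typeof value !== "object" || value === null) return false;
  const t = value as Record<string, unknown>;
  return (
    typeof t.name === "string" &&
    Array.isArray(t.lib) && t.lib.every((d) => typeof d === "string") &&
    typeof t.file === "string" &&
    typeof t.instructions === "string" &&
    typeof t.port === "number"
  );
}
```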
### Adding Custom LLM Models

1. Register the model in `lib/models.json`:

   ```json
   {
     "id": "custom-model-id",
     "name": "Custom Model Name",
     "provider": "Provider Name",
     "providerId": "provider-id",
     "multiModal": true
   }
   ```

   Parameters:

   - `id`: Unique model identifier
   - `name`: Display name in the UI
   - `provider`: Human-readable provider name
   - `providerId`: Provider configuration key
   - `multiModal`: Whether the model supports images/vision

2. Configure the provider in `lib/models.ts` by adding an entry to the `providerConfigs` object:

   ```ts
   'custom-provider': () => createOpenAI({
     apiKey: apiKey || process.env.CUSTOM_PROVIDER_API_KEY,
     baseURL: baseURL || 'https://api.customprovider.com/v1'
   })(modelNameString)
   ```

3. Set the output mode (optional) in `getDefaultMode`:

   ```ts
   if (providerId === 'custom-provider') {
     return 'json' // or 'tool' or 'object'
   }
   ```

4. Add the environment variable:

   ```sh
   CUSTOM_PROVIDER_API_KEY="your-api-key"
   ```

5. Add a provider logo (optional): place an SVG logo in `public/thirdparty/logos/`.
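The `providerConfigs` dispatch described above can be sketched as a plain map from provider id to a factory that resolves credentials, with a user-supplied key taking priority over an environment fallback. To keep the sketch runnable without the Vercel AI SDK, the factory here returns a settings object instead of a model, and the environment is passed in explicitly; all names below are assumptions for the example, not the repository's actual API.

```typescript
// Illustrative sketch of the providerConfigs lookup pattern. The real
// lib/models.ts factories return Vercel AI SDK models (e.g. via createOpenAI);
// this version returns plain settings so it runs standalone.
type ProviderSettings = { apiKey: string; baseURL: string };
type Env = Record<string, string | undefined>;
type ProviderFactory = (env: Env, apiKey?: string, baseURL?: string) => ProviderSettings;

const providerConfigs: Record<string, ProviderFactory> = {
  "custom-provider": (env, apiKey, baseURL) => ({
    // User-supplied key wins; otherwise fall back to the env variable.
    apiKey: apiKey ?? env.CUSTOM_PROVIDER_API_KEY ?? "",
    baseURL: baseURL ?? "https://api.customprovider.com/v1",
  }),
};

function resolveProvider(
  providerId: string,
  env: Env,
  apiKey?: string,
  baseURL?: string
): ProviderSettings {
  const factory = providerConfigs[providerId];
  if (!factory) throw new Error(`Unknown providerId: ${providerId}`);
  return factory(env, apiKey, baseURL);
}
```

Keeping each provider behind a factory means adding a new provider is a one-entry change, which is exactly why step 2 above only touches the `providerConfigs` object.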
### Contribution Workflow

1. Fork the repository.
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Make your changes and test thoroughly.
4. Run linting: `npm run lint`
5. Commit changes: `git commit -m 'Add amazing feature'`
6. Push to the branch: `git push origin feature/amazing-feature`
7. Open a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
