A simple, configurable Twitter bot that generates and posts tweets using local AI (Ollama). Perfect for beginners to create automated Twitter content like news updates, weather reports, or daily posts.
- AI-Powered Tweets: Uses Ollama (local LLM) to generate engaging tweets
- Data Sources: Optionally fetch data from URLs (APIs, websites) to include in tweets
- Configurable Prompts: Customize what the AI generates with prompt templates
- Dry Run Mode: Test everything without posting
- State Management: Tracks last run time
- Scheduling: Designed for cron jobs (e.g., daily posts)
- Comprehensive Logging: Logs all activity and errors
- Python 3.8+ (tested on 3.13)
- Twitter/X developer account
- Ollama installed (for local AI)
1. Clone or download this repository

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Install Ollama:

   ```bash
   # Install Ollama
   curl -fsSL https://ollama.com/install.sh | sh

   # Download a model (e.g., Llama 3)
   ollama pull llama3

   # Verify it's running (starts server automatically)
   ollama list
   ```
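Under the hood, the bot talks to Ollama's HTTP API. A minimal sketch of such a call is below; the function name `generate_tweet` and the non-streaming payload are illustrative assumptions, not necessarily the template's exact code:

```python
import requests


def generate_tweet(host: str, model: str, prompt: str, timeout: int = 60) -> str:
    """POST a prompt to Ollama's /api/generate endpoint and return the text.

    Assumes `host` is the full endpoint URL from config.json,
    e.g. http://localhost:11434/api/generate.
    """
    resp = requests.post(
        host,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=timeout,
    )
    resp.raise_for_status()
    # With "stream": False, Ollama returns a single JSON object whose
    # "response" field holds the generated text.
    return resp.json()["response"].strip()
```

Setting `"stream": False` keeps the client simple: one request, one JSON reply, instead of a stream of partial chunks.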
4. Get Twitter API keys:
   - Go to the Twitter Developer Portal
   - Create a new app or use an existing one
   - Get your API keys from the "Keys and Tokens" section:
     - Consumer Key (API Key)
     - Consumer Secret (API Key Secret)
     - Access Token
     - Access Token Secret
5. Configure the bot: Copy `config.json.example` to `config.json` and edit it with your keys:

   ```bash
   cp config.json.example config.json
   ```

   Edit `config.json`:

   ```json
   {
     "twitter": {
       "consumer_key": "YOUR_CONSUMER_KEY",
       "consumer_secret": "YOUR_CONSUMER_SECRET",
       "access_token": "YOUR_ACCESS_TOKEN",
       "access_secret": "YOUR_ACCESS_SECRET"
     },
     "ollama": {
       "host": "http://localhost:11434/api/generate",
       "model": "llama3"
     },
     "prompt_template": "Write a short tweet about: {data}",
     "source_url": "https://api.example.com/data",
     "dry_run": true
   }
   ```
Run in dry-run mode first:

```bash
python TwitterBotTemplate.py
```

Check `bot.log` for output. It should show the tweet it would post without actually posting.

Once testing is successful:
- Set `"dry_run": false` in `config.json`
- Run the bot: `python TwitterBotTemplate.py`
- Set up a cron job for automatic runs (e.g., daily at 9 AM)
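The dry-run switch amounts to a guard around the actual post call. A sketch of that flow, with illustrative names (`run_once`, `post_fn`) rather than the template's actual functions:

```python
def run_once(cfg: dict, tweet_text: str, post_fn, log=print) -> bool:
    """Post the tweet unless dry_run is set; return True if it was posted."""
    if cfg.get("dry_run", False):
        # Dry run: log what would happen, touch nothing on Twitter.
        log(f"[DRY RUN] Would post: {tweet_text}")
        return False
    post_fn(tweet_text)
    log(f"Posted: {tweet_text}")
    return True
```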
- `consumer_key`, `consumer_secret`, `access_token`, `access_secret`: Your Twitter API keys
- `host`: Ollama API endpoint (default: `http://localhost:11434/api/generate`)
- `model`: Model name (e.g., `"llama3"`, `"mistral"`)
- `prompt_template`: Template for the AI prompt (use `{data}` for fetched content)
- `source_url`: Optional URL to fetch data from (JSON or text)
- `post_prob`: Probability to post each run (0.0-1.0, default 1.0 = always)
- `db_path`: Database file (default: `"bot_state.db"`)
- `log_path`: Log file (default: `"bot.log"`)
- `dry_run`: Test mode (default: `false`)
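A setting like `post_prob` can be implemented as a single random draw per run. A sketch, assuming a helper named `should_post` (not necessarily the template's actual code):

```python
import random


def should_post(post_prob: float, rng=random.random) -> bool:
    """Decide whether this run should post.

    post_prob of 1.0 always posts, 0.0 never posts, and 0.5 posts
    roughly every other run on average. `rng` is injectable for testing.
    """
    return rng() < post_prob
```

This is handy with a daily cron job: `post_prob` of 0.7 makes the bot skip some days, so the posting pattern looks less mechanical.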
Run the bot manually:

```bash
python TwitterBotTemplate.py
```

Add to crontab for daily posts at 9 AM:

```bash
crontab -e
```

Add this line:

```
0 9 * * * cd /path/to/your_bot && /usr/bin/python3 TwitterBotTemplate.py >> bot.log 2>&1
```
Use Task Scheduler:
- Create a new task
- Set the trigger to daily at 9 AM
- Action: Start a program
- Program: `python.exe`
- Arguments: `TwitterBotTemplate.py`
- Start in: `C:\path\to\your_bot`
Watch the log:

```bash
tail -f bot.log
```

Common issues:
- 401 Unauthorized (Twitter): Check your Twitter API keys
- Connection refused (Ollama): Ensure Ollama is running (`ollama serve`)
- No data from `source_url`: Check the URL and API key
- Rate limited: Twitter has posting limits; space out runs
Run unit tests:

```bash
python -m pytest tests/
```

To customize:
- Set `source_url` to any API endpoint or webpage
- Modify `prompt_template` to include the `{data}` placeholder
- For complex sources, extend `fetch_data()` or add custom modules
Examples:
- `"Write a daily news summary: {data}"` (with news API)
- `"Generate a fun fact tweet"`
- `"Tweet about today's date: {data}"` (with date API)
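Filling the `{data}` placeholder is a small substitution step. A sketch that also tolerates templates without the placeholder, like the fun-fact example (the helper name `build_prompt` is illustrative):

```python
def build_prompt(template: str, data=None) -> str:
    """Substitute fetched data into the prompt template.

    Templates without {data} pass through unchanged; missing data
    becomes an empty string. str.replace is used instead of str.format
    so stray braces in fetched content can't raise formatting errors.
    """
    if "{data}" in template:
        return template.replace("{data}", str(data) if data is not None else "")
    return template
```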
The code is modular – add new functions for different content types or posting logic.
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- Never commit API keys to version control
- Use environment variables for sensitive data in production
- Rotate API keys regularly
- Monitor bot activity for abuse
This template is designed to be simple yet expandable. Start with basic tweets, then add sources and custom logic as needed.
