topic-monitor
Monitor topics of interest and proactively alert when important developments occur.
Installation
npx clawhub@latest install topic-monitor
View the full skill documentation and source below.
Documentation
Topic Monitor
Monitor what matters. Get notified when it happens.
Topic Monitor transforms your assistant from reactive to proactive by continuously monitoring topics you care about and intelligently alerting you only when something truly matters.
⚡ Quick Start (New in v1.2.0!)
Just want to monitor one topic? One command:
python3 scripts/quick.py "AI Model Releases"
That's it! This creates a topic with sensible defaults:
- Query: Auto-generated from topic name
- Keywords: Extracted from topic name
- Frequency: Daily
- Importance: Medium
- Channel: Telegram
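Those defaults can be derived mechanically from the topic name. A minimal sketch of that derivation (illustrative only, not the actual quick.py implementation; field names follow the config format documented below):

```python
import re

def quick_defaults(topic_name: str) -> dict:
    """Illustrative sketch of how quick-start defaults could be derived
    from a topic name. Not the actual quick.py implementation."""
    # Slug id: lowercase, runs of non-alphanumerics collapsed to hyphens
    topic_id = re.sub(r"[^a-z0-9]+", "-", topic_name.lower()).strip("-")
    # Keywords: the individual words of the topic name
    keywords = [w for w in re.split(r"\s+", topic_name) if w]
    return {
        "id": topic_id,
        "name": topic_name,
        "query": f"{topic_name} news updates",  # matches the wizard's default query
        "keywords": keywords,
        "frequency": "daily",
        "importance_threshold": "medium",
        "channels": ["telegram"],
    }

print(quick_defaults("AI Model Releases")["id"])  # → ai-model-releases
```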
Quick Start Options
# Basic - just a topic name
python3 scripts/quick.py "Bitcoin Price"
# With keywords
python3 scripts/quick.py "Security CVEs" --keywords "CVE,vulnerability,critical"
# High priority, hourly checks
python3 scripts/quick.py "Production Alerts" --frequency hourly --importance high
# Custom query
python3 scripts/quick.py "Competitor News" --query "CompanyName product launch funding"
# Different channel
python3 scripts/quick.py "Team Updates" --channel discord
Quick Start vs Full Setup
| Feature | Quick Start | Full Setup |
|---|---|---|
| Speed | ⚡ 1 command | 📝 Wizard |
| Defaults | Smart | Customizable |
| Use case | Single topic | Multiple topics |
| Configuration | Minimal | Full control |
You can fine-tune a quick-start topic later:
python3 scripts/manage_topics.py edit ai-model-releases --frequency hourly
Core Capabilities
Full Setup (Interactive Wizard)
For configuring multiple topics or advanced options:
python3 scripts/setup.py
The wizard guides you through naming topics, setting search queries and keywords, and choosing a check frequency, then writes config.json with your preferences. You can edit that file later or use manage_topics.py to add/remove topics.
Example session:
🔍 Topic Monitor - Setup Wizard
What topics do you want to monitor?
> AI Model Releases
> Security Vulnerabilities
>
--- Topic 1/2: AI Model Releases ---
Search query for 'AI Model Releases' [AI Model Releases news updates]: new AI model release announcement
Keywords to watch for in 'AI Model Releases'?
> GPT, Claude, Llama, release
--- Topic 2/2: Security Vulnerabilities ---
Search query for 'Security Vulnerabilities' [Security Vulnerabilities news updates]: CVE critical vulnerability patch
Keywords to watch for in 'Security Vulnerabilities'?
> CVE, vulnerability, critical, patch
How often should I check for updates?
1. hourly
2. daily *
3. weekly
✅ Setup Complete!
Manual Setup
Already know what you're doing? Here's the manual approach:
# Initialize config from template
cp config.example.json config.json
# Add a topic
python3 scripts/manage_topics.py add "Product Updates" \
--keywords "release,update,patch" \
--frequency daily \
--importance medium
# Test monitoring (dry run)
python3 scripts/monitor.py --dry-run
# Set up cron for automatic monitoring
python3 scripts/setup_cron.py
Topic Configuration
Each topic has:
- name - Display name (e.g., "AI Model Releases")
- query - Search query (e.g., "new AI model release announcement")
- keywords - Relevance filters (["GPT", "Claude", "Llama", "release"])
- frequency - hourly, daily, or weekly
- importance_threshold - high (alert immediately), medium (alert if important), low (digest only)
- channels - Where to send alerts (["telegram", "discord"])
- context - Why you care (for AI contextual summaries)
Example config.json
{
"topics": [
{
"id": "ai-models",
"name": "AI Model Releases",
"query": "new AI model release GPT Claude Llama",
"keywords": ["GPT", "Claude", "Llama", "release", "announcement"],
"frequency": "daily",
"importance_threshold": "high",
"channels": ["telegram"],
"context": "Following AI developments for work",
"alert_on": ["model_release", "major_update"]
},
{
"id": "tech-news",
"name": "Tech Industry News",
"query": "technology startup funding acquisition",
"keywords": ["startup", "funding", "Series A", "acquisition"],
"frequency": "daily",
"importance_threshold": "medium",
"channels": ["telegram"],
"context": "Staying informed on tech trends",
"alert_on": ["major_funding", "acquisition"]
},
{
"id": "security-alerts",
"name": "Security Vulnerabilities",
"query": "CVE critical vulnerability security patch",
"keywords": ["CVE", "vulnerability", "security", "patch", "critical"],
"frequency": "hourly",
"importance_threshold": "high",
"channels": ["telegram", "email"],
"context": "DevOps security monitoring",
"alert_on": ["critical_cve", "zero_day"]
}
],
"settings": {
"digest_day": "sunday",
"digest_time": "18:00",
"max_alerts_per_day": 5,
"deduplication_window_hours": 72,
"learning_enabled": true
}
}
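The frequency field is what lets the cron-driven runs of monitor.py --frequency pick out only the topics that are due. A minimal sketch of that selection, assuming the config structure shown above (the real script may differ):

```python
EXAMPLE = {
    "topics": [
        {"id": "ai-models", "frequency": "daily"},
        {"id": "security-alerts", "frequency": "hourly"},
    ]
}

def topics_for_frequency(config: dict, frequency: str) -> list:
    """Illustrative: select the topic ids a `monitor.py --frequency X`
    run would need to check."""
    return [t["id"] for t in config.get("topics", []) if t.get("frequency") == frequency]

print(topics_for_frequency(EXAMPLE, "hourly"))  # → ['security-alerts']
```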
Scripts
manage_topics.py
Manage research topics:
# Add topic
python3 scripts/manage_topics.py add "Topic Name" \
--query "search query" \
--keywords "word1,word2" \
--frequency daily \
--importance medium \
--channels telegram
# List topics
python3 scripts/manage_topics.py list
# Edit topic
python3 scripts/manage_topics.py edit eth-price --frequency hourly
# Remove topic
python3 scripts/manage_topics.py remove eth-price
# Test topic (preview results without saving)
python3 scripts/manage_topics.py test eth-price
monitor.py
Main monitoring script (run via cron):
# Normal run (alerts + saves state)
python3 scripts/monitor.py
# Dry run (no alerts, shows what would happen)
python3 scripts/monitor.py --dry-run
# Force check specific topic
python3 scripts/monitor.py --topic eth-price
# Verbose logging
python3 scripts/monitor.py --verbose
How it works: for each topic that is due, the script runs the search query, scores results for importance, deduplicates against previously alerted URLs, sends alerts through the configured channels, and saves state.
digest.py
Generate weekly digest:
# Generate digest for current week
python3 scripts/digest.py
# Generate and send
python3 scripts/digest.py --send
# Preview without sending
python3 scripts/digest.py --preview
Output format:
# Weekly Research Digest - [Date Range]
## 🔥 Highlights
- **AI Models**: Claude 4.5 released with improved reasoning
- **Security**: Critical CVE patched in popular framework
## 📊 By Topic
### AI Model Releases
- [3 findings this week]
### Security Vulnerabilities
- [1 finding this week]
## 💡 Recommendations
Based on your interests, you might want to monitor:
- "Kubernetes security" (mentioned 3x this week)
setup_cron.py
Configure automated monitoring:
# Interactive setup
python3 scripts/setup_cron.py
# Auto-setup with defaults
python3 scripts/setup_cron.py --auto
# Remove cron jobs
python3 scripts/setup_cron.py --remove
Creates cron entries:
# Topic Monitor - Hourly topics
0 * * * * cd /path/to/skills/topic-monitor && python3 scripts/monitor.py --frequency hourly
# Topic Monitor - Daily topics
0 9 * * * cd /path/to/skills/topic-monitor && python3 scripts/monitor.py --frequency daily
# Topic Monitor - Weekly digest
0 18 * * 0 cd /path/to/skills/topic-monitor && python3 scripts/digest.py --send
AI Importance Scoring
The scorer uses multiple signals to decide alert priority:
Scoring Signals
HIGH priority (immediate alert):
- Major breaking news (detected via freshness + keyword density)
- Price changes >10% (for finance topics)
- Product releases matching your exact keywords
- Security vulnerabilities in tools you use
- Direct answers to specific questions you asked
MEDIUM priority (digest-worthy):
- Related news but not urgent
- Minor updates to tracked products
- Interesting developments in your topics
- Tutorial/guide releases
- Community discussions with high engagement
LOW priority (ignore):
- Duplicate news (already alerted)
- Tangentially related content
- Low-quality sources
- Outdated information
- Spam/promotional content
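A scorer combining these signals might look like the following sketch. It is illustrative only, not the skill's actual algorithm; the `published` field on a finding and the density threshold are assumptions:

```python
from datetime import datetime, timezone

def score_finding(finding: dict, topic: dict) -> str:
    """Illustrative importance scorer using the signals above:
    keyword density plus freshness. Not the skill's actual algorithm."""
    text = (finding.get("title", "") + " " + finding.get("summary", "")).lower()
    keywords = [k.lower() for k in topic.get("keywords", [])]
    hits = sum(text.count(k) for k in keywords)
    density = hits / max(len(text.split()), 1)

    published = finding.get("published")  # ISO 8601 timestamp, assumed field
    age_hours = float("inf")
    if published:
        dt = datetime.fromisoformat(published.replace("Z", "+00:00"))
        age_hours = (datetime.now(timezone.utc) - dt).total_seconds() / 3600

    if density > 0.05 and age_hours < 24:
        return "high"     # fresh and keyword-dense → immediate alert
    if hits > 0:
        return "medium"   # relevant but not urgent → digest
    return "low"          # ignore
```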
Learning Mode
When enabled (learning_enabled: true), the system tracks which alerts prove useful and refines importance scoring over time.
Learning data stored in .learning_data.json (privacy-safe, never shared).
Memory Integration
Topic Monitor connects to your conversation history:
Example alert:
🔔 Dirac Live Update
Version 3.8 released with the room correction improvements you asked about last week.
Context: You mentioned struggling with bass response in your studio. This update includes new low-frequency optimization.
[Link] | [Full details]
How it works: before sending an alert, the monitor checks your conversation history for related discussions and, when it finds one, adds a context line explaining why the finding matters to you.
memory_hints.md (optional)
Help the AI connect dots:
# Memory Hints for Topic Monitor
## AI Models
- Using Claude for coding assistance
- Interested in reasoning improvements
- Comparing models for different use cases
## Security
- Running production Kubernetes clusters
- Need to patch critical CVEs quickly
- Interested in zero-day disclosures
## Tech News
- Following startup ecosystem
- Interested in developer tools space
- Tracking potential acquisition targets
Alert Channels
Telegram
Requires OpenClaw message tool:
{
"channels": ["telegram"],
"telegram_config": {
"chat_id": "@your_username",
"silent": false,
"effects": {
"high_importance": "🔥",
"medium_importance": "📌"
}
}
}
Discord
Webhook-based:
{
"channels": ["discord"],
"discord_config": {
"webhook_url": "",
"username": "Research Bot",
"avatar_url": ""
}
}
Email
SMTP or API:
{
"channels": ["email"],
"email_config": {
"to": "you@example.com",
"from": "research@yourdomain.com",
"smtp_server": "smtp.gmail.com",
"smtp_port": 587
}
}
Advanced Features
Alert Conditions
Fine-tune when to alert:
{
"alert_on": [
"price_change_10pct",
"keyword_exact_match",
"source_tier_1",
"high_engagement"
],
"ignore_sources": [
"spam-site.com",
"clickbait-news.io"
],
"boost_sources": [
"github.com",
"arxiv.org",
"official-site.com"
]
}
Regex Patterns
Match specific patterns:
{
"patterns": [
"version \\d+\\.\\d+\\.\\d+",
"\\$\\d{1,3}(,\\d{3})*",
"CVE-\\d{4}-\\d+"
]
}
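Note that JSON requires doubled backslashes; in Python these become ordinary raw-string regexes. A sketch of how a finding's text might be checked against the configured patterns (illustrative, not the skill's actual code):

```python
import re

# The JSON patterns above, with JSON escaping removed (Python raw strings)
PATTERNS = [
    r"version \d+\.\d+\.\d+",   # semantic versions
    r"\$\d{1,3}(,\d{3})*",      # dollar amounts like $1,500,000
    r"CVE-\d{4}-\d+",           # CVE identifiers
]

def matches_any(text: str) -> bool:
    """Illustrative: a finding matches if any configured pattern hits."""
    return any(re.search(p, text) for p in PATTERNS)

print(matches_any("Patched in version 3.8.1 (CVE-2026-12345)"))  # → True
```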
Rate Limiting
Prevent alert fatigue:
{
"settings": {
"max_alerts_per_day": 5,
"max_alerts_per_topic_per_day": 2,
"quiet_hours": {
"start": "22:00",
"end": "08:00"
}
}
}
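A quiet-hours window like 22:00–08:00 wraps past midnight, so a naive start-to-end comparison fails; the check has to flip when start > end. A sketch of that logic (illustrative, not the skill's actual code):

```python
from datetime import time

def in_quiet_hours(now: time, start: time, end: time) -> bool:
    """Illustrative quiet-hours check. A 22:00–08:00 window wraps past
    midnight, so the comparison flips when start > end."""
    if start <= end:
        return start <= now < end
    return now >= start or now < end

# 23:30 falls inside a 22:00–08:00 window; 12:00 does not
print(in_quiet_hours(time(23, 30), time(22, 0), time(8, 0)))  # → True
```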
State Management
.research_state.json
Tracks:
- Last check time per topic
- Alerted URLs (deduplication)
- Importance scores history
- Learning data (if enabled)
Example:
{
"topics": {
"eth-price": {
"last_check": "2026-01-28T22:00:00Z",
"last_alert": "2026-01-28T15:30:00Z",
"alerted_urls": [
""
],
"findings_count": 3,
"alerts_today": 1
}
},
"deduplication": {
"url_hash_map": {
"abc123": "2026-01-28T15:30:00Z"
}
}
}
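The url_hash_map above is what backs deduplication: hash each URL and suppress re-alerts inside the configured window. A sketch assuming that layout (the short-hash format and field names are assumptions based on the example; the real script may differ):

```python
import hashlib
from datetime import datetime, timedelta, timezone

def is_duplicate(url: str, url_hash_map: dict, window_hours: int = 72) -> bool:
    """Illustrative deduplication against a url_hash_map like the one in
    the example state file: hash the URL, then suppress re-alerts whose
    previous alert falls inside the deduplication window."""
    key = hashlib.sha256(url.encode()).hexdigest()[:6]  # short hash like "abc123" (assumed format)
    seen = url_hash_map.get(key)
    if seen is None:
        return False  # never alerted before
    seen_at = datetime.fromisoformat(seen.replace("Z", "+00:00"))
    return datetime.now(timezone.utc) - seen_at < timedelta(hours=window_hours)
```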
.findings/ directory
Stores digest-worthy findings:
.findings/
├── 2026-01-22_eth-price.json
├── 2026-01-24_fm26-patches.json
└── 2026-01-27_ai-breakthroughs.json
Best Practices
- Start with importance_threshold: medium initially, adjust based on alert quality
- Use negative keywords to filter noise: "keywords": ["AI", "-clickbait", "-spam"]
Integration with Other Skills
web-search-plus
Automatically uses intelligent routing:
- Product/price topics → Serper
- Research topics → Tavily
- Company/startup discovery → Exa
personal-analytics
Suggests topics based on conversation patterns:
"You've asked about Rust 12 times this month. Want me to monitor 'Rust language updates'?"
Privacy & Security
- All data local - No external services except search APIs
- State files gitignored - Safe to use in version-controlled workspace
- Memory hints optional - You control what context is shared
- Learning data stays local - Never sent to APIs
Troubleshooting
No alerts being sent:
- Check cron is running: crontab -l
- Verify channel config (Telegram chat ID, Discord webhook)
- Run with --dry-run --verbose to see scoring
Too many alerts:
- Increase importance_threshold
- Add rate limiting
- Refine keywords (add negative filters)
- Enable learning mode
Missing important news:
- Decrease importance_threshold
- Increase check frequency
- Broaden keywords
- Check .research_state.json for deduplication issues
Digest not generating:
- Verify .findings/ directory exists and has content
- Check digest cron schedule
- Run manually: python3 scripts/digest.py --preview
Example Workflows
Track Product Release
python3 scripts/manage_topics.py add "iPhone 17 Release" \
--query "iPhone 17 announcement release date" \
--keywords "iPhone 17,Apple event,September" \
--frequency daily \
--importance high \
--channels telegram \
--context "Planning to upgrade from iPhone 13"
Monitor Competitor
python3 scripts/manage_topics.py add "Competitor Analysis" \
--query "CompetitorCo product launch funding" \
--keywords "CompetitorCo,product,launch,Series,funding" \
--frequency weekly \
--importance medium \
--channels discord,email
Research Topic
python3 scripts/manage_topics.py add "Quantum Computing Papers" \
--query "quantum computing arxiv" \
--keywords "quantum,qubit,arxiv" \
--frequency weekly \
--importance low \
--channels email
Credits
Built for ClawHub. Uses web-search-plus skill for intelligent search routing.