Webspark is a powerful web automation tool that records user interactions on websites and analyzes Tealium tag implementations. It features a modern, intuitive interface for recording macros and performing comprehensive tag analysis.
- Macro Recording: Record user interactions on any website
- Tealium Analysis: Analyze tag implementations and track events
- Modern UI: Clean, responsive interface with dark theme
- Real-time Streaming: Live analysis with progress tracking
- Data Export: Export analysis results and macro data
- Fast Performance: Built with FastAPI and modern web technologies
- Backend: Python 3.8+, FastAPI, uvicorn
- Browser Automation: Playwright for webpage analysis
- Frontend: HTML5, CSS3, JavaScript ES6+
- Real-time: Server-Sent Events (SSE)
- UI Framework: Custom CSS with Font Awesome icons
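Since the live analysis is delivered over Server-Sent Events, it may help to see the wire framing involved. The helper below is purely illustrative (it is not part of Webspark's code, and the `progress` event name is an assumption); it shows how a `text/event-stream` message is formatted:

```python
# Illustrative SSE framing helper -- not Webspark's actual code.
# Shows how one message on a text/event-stream connection is laid out.
def sse_event(data, event=None):
    """Format a single Server-Sent Events message."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    # Multi-line payloads become repeated "data:" lines per the SSE spec.
    lines.extend(f"data: {line}" for line in str(data).splitlines())
    return "\n".join(lines) + "\n\n"
```

A browser `EventSource` listening for `"progress"` events would receive `sse_event("50%", event="progress")` as one update.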
- Python 3.8+ (required)
- Git for repository cloning
- 4GB+ RAM for browser automation
- Modern web browser (Chrome/Chromium recommended)
Important: Use the `webspark` environment name for consistency:

```
# Create virtual environment (DO NOT use conda)
python -m venv webspark

# Activate environment
# Windows:
webspark\Scripts\activate
# macOS/Linux:
source webspark/bin/activate
```

Install the dependencies:

```
# Install Python dependencies
pip install -r requirements.txt

# Install Playwright browsers
python -m playwright install chromium

# Verify installation
python -c "from playwright.sync_api import sync_playwright; print('Playwright ready')"
```

Start the server:

```
python app.py
```

Access the application:

- Homepage: http://localhost:5000
- Macro Recorder: http://localhost:5000/record
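Once `python app.py` is running, a quick check with the standard library can confirm the server answers. The helper below is a hypothetical convenience, not part of the repository; adjust the port if the app fell back to 5001 or 5002:

```python
# Hypothetical smoke check -- not part of the Webspark codebase.
from urllib.request import urlopen

def server_is_up(url="http://localhost:5000", timeout=3.0):
    """Return True if the given URL responds with HTTP 200."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, timeout, DNS failure, etc.
        return False
```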
```
webspark/
├── app.py                   # Main FastAPI application
├── analyzers/               # Analysis modules
│   ├── tealium_manual_analyzer.py
│   └── macro_tealium_analyzer.py
├── core/                    # Core functionality
│   └── macro_recorder.py
├── static/                  # Frontend assets
│   ├── *.css                # Stylesheets
│   ├── recorder.js          # Main JavaScript
│   ├── macros.css           # Macro cards styling
│   └── images/              # Static images
├── templates/               # HTML templates
│   ├── index.html           # Homepage
│   └── record.html          # Recording page
├── data/                    # Generated data (gitignored)
│   ├── macros/              # Saved macro files
│   └── *_analysis.json      # Analysis results
├── requirements.txt         # Python dependencies
├── selectors_config.py      # Element selector configuration
└── .gitignore               # Git ignore rules
```
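Because the `data/` tree is gitignored, it may be absent after a fresh clone. A small helper like the following (hypothetical, not shipped with the project) could create the expected layout before a run:

```python
# Hypothetical helper -- creates the gitignored data/ layout shown above.
from pathlib import Path

def ensure_data_dirs(root=Path(".")):
    """Create data/ and data/macros/ if missing; return the data directory."""
    data_dir = Path(root) / "data"
    (data_dir / "macros").mkdir(parents=True, exist_ok=True)
    return data_dir
```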
- Navigate to recorder: http://localhost:5000/record
- Enter target URL: Input the website you want to record
- Start recording: Click "Start Recording" button
- Interact with website: Click links, buttons, forms, etc.
- Stop recording: Click "Stop Recording" to save the macro
- Find saved macro: Check the "Saved Macros" section
- Click analyze: Press the green "Analyze" button
- Watch progress: Real-time analysis with progress tracking
- Review results: Detailed Tealium event analysis and vendor detection
- Analysis results are automatically saved to the `data/` directory
- Download buttons available in the UI for specific reports
- Macro data stored as JSON files in `data/macros/`
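Since macros are stored as plain JSON, they can be inspected with the standard library. The sketch below assumes a top-level `"steps"` list; that key is an illustrative guess, not a documented Webspark schema:

```python
# Assumed layout: each file in data/macros/ holds a JSON object; the
# "steps" key is an illustrative guess, not a documented schema.
import json

def load_macro(path):
    """Read one saved macro file and return its recorded steps."""
    with open(path, encoding="utf-8") as fh:
        macro = json.load(fh)
    return macro.get("steps", [])
```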
Create a `.env` file for custom settings:

```
# Application settings
DEBUG=False
HOST=0.0.0.0
PORT=5000

# Browser settings
BROWSER_HEADLESS=True
ANALYSIS_TIMEOUT=300

# Logging
LOG_LEVEL=INFO
```

Modify `selectors_config.py` to customize analysis targets:
```python
SELECTORS_CONFIG = {
    "affiliate_links": ".affiliate-buttons a",
    "add_to_cart": "[data-testid='add-to-cart'], .add-to-cart",
    "checkout": ".checkout-button, [href*='checkout']"
}
```

Browser not found:

```
python -m playwright install chromium
```

Port already in use: The app automatically tries ports 5000, 5001, 5002.
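That port fallback might look like the sketch below. The candidate list mirrors the documented behavior, but the helper itself is illustrative, not Webspark's actual code:

```python
# Illustrative sketch of the documented fallback: try 5000, then 5001, 5002.
import socket

def pick_port(candidates=(5000, 5001, 5002)):
    """Return the first candidate port that can be bound."""
    for port in candidates:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            try:
                sock.bind(("0.0.0.0", port))
            except OSError:
                continue  # port busy -- try the next candidate
            return port
    raise RuntimeError("no candidate port is free")
```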
Permission errors (Windows): Run command prompt as Administrator for Playwright installation.
Virtual environment issues:

```
# Deactivate and recreate
deactivate
rmdir /s webspark   # Windows
# rm -rf webspark   # macOS/Linux
python -m venv webspark
webspark\Scripts\activate
pip install -r requirements.txt
```

Enable detailed logging:
```
# Windows
set LOG_LEVEL=DEBUG
python app.py

# macOS/Linux
LOG_LEVEL=DEBUG python app.py
```

Install a production server:
```
pip install "uvicorn[standard]"

# Run production server
uvicorn app:app --host 0.0.0.0 --port 5000 --workers 4
```

Alternatively, use gunicorn as the process manager with uvicorn workers (gunicorn alone cannot serve an ASGI app like FastAPI):

```
pip install gunicorn
gunicorn app:app -k uvicorn.workers.UvicornWorker --bind 0.0.0.0:5000 --workers 4
```

- Python: Follow PEP 8 guidelines
- JavaScript: ES6+ features, async/await preferred
- CSS: CSS custom properties for theming
- Backend changes: Modify `app.py` and `analyzers/`
- Frontend changes: Update `static/` and `templates/`
- Test thoroughly with different websites
- Update documentation
MIT License - see LICENSE file for details.
- Fork the repository
- Create feature branch: `git checkout -b feature-name`
- Make changes and test
- Commit: `git commit -m 'Add feature'`
- Push: `git push origin feature-name`
- Submit pull request
- Use the `webspark` virtual environment name for consistency
- DO NOT use conda - use standard Python venv
- Respect website terms of service when recording macros
- Screenshots and macro data are excluded from git commits