Jake Shore f3c4cd817b Add all MCP servers + factory infra to MCPEngine — 2026-02-06
=== NEW SERVERS ADDED (7) ===
- servers/closebot — 119 tools, 14 modules, 4,656 lines TS (Stage 7)
- servers/google-console — Google Search Console MCP (Stage 7)
- servers/meta-ads — Meta/Facebook Ads MCP (Stage 8)
- servers/twilio — Twilio communications MCP (Stage 8)
- servers/competitor-research — Competitive intel MCP (Stage 6)
- servers/n8n-apps — n8n workflow MCP apps (Stage 6)
- servers/reonomy — Commercial real estate MCP (Stage 1)

=== FACTORY INFRASTRUCTURE ADDED ===
- infra/factory-tools — mcp-jest, mcp-validator, mcp-add, MCP Inspector
  - 60 test configs, 702 auto-generated test cases
  - All 30 servers score 100/100 protocol compliance
- infra/command-center — Pipeline state, operator playbook, dashboard config
- infra/factory-reviews — Automated eval reports

=== DOCS ADDED ===
- docs/MCP-FACTORY.md — Factory overview
- docs/reports/ — 5 pipeline evaluation reports
- docs/research/ — Browser MCP research

=== RULES ESTABLISHED ===
- CONTRIBUTING.md — All MCP work MUST go in this repo
- README.md — Full inventory of 37 servers + infra docs
- .gitignore — Updated for Python venvs

TOTAL: 37 MCP servers + full factory pipeline in one repo.
This is now the single source of truth for all MCP work.

name: MCP HTTP Protocol Validation

on:
  pull_request:
    branches: [ main, master ]
  workflow_dispatch: # Allow manual triggering

jobs:
  validate-http:
    name: Validate HTTP MCP Server
    runs-on: ubuntu-latest
    strategy:
      matrix:
        protocol-version: ["2025-03-26"] # Just test the latest version in the MVP
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.10'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -e .
          pip install pytest pytest-html

      - name: Run HTTP Compliance Tests
        id: http-test
        run: |
          mkdir -p reports

          # Start the HTTP server in the background
          echo "Starting HTTP server..."
          python ref_http_server/reference_mcp_server.py &
          SERVER_PID=$!

          # Wait for the server to start
          echo "Waiting for server to start..."
          sleep 5

          # Run the HTTP compliance tests
          echo "Running compliance tests..."
          python -m mcp_testing.scripts.http_compliance_test \
            --protocol-version ${{ matrix.protocol-version }} \
            --output-dir reports

          # Stop the server
          echo "Stopping HTTP server..."
          kill $SERVER_PID || true

      - name: Upload test reports
        if: always() # keep reports as artifacts even when the compliance tests fail
        uses: actions/upload-artifact@v4
        with:
          name: mcp-http-reports
          path: reports/
          retention-days: 14

      - name: Post summary to PR
        if: always() && github.event_name == 'pull_request' # comment even when tests fail
        uses: actions/github-script@v7
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            const fs = require('fs');
            try {
              const reportFiles = fs.readdirSync('reports');
              const jsonReportFile = reportFiles.find(file => file.endsWith('.json'));

              let summary = "## MCP HTTP Validation Results\n\n";
              if (jsonReportFile) {
                const reportData = JSON.parse(fs.readFileSync(`reports/${jsonReportFile}`, 'utf8'));
                summary += `- Protocol Version: ${reportData.protocol_version || 'Unknown'}\n`;
                summary += `- Success Rate: ${reportData.success_rate || 'Unknown'}\n`;
                summary += `- Tests Run: ${reportData.total_tests || 0}\n`;

                // List any failed tests
                const failedTests = (reportData.test_cases || []).filter(tc => tc.status === 'failed');
                if (failedTests.length > 0) {
                  summary += `\n### Failed Tests (${failedTests.length}):\n\n`;
                  failedTests.forEach(test => {
                    summary += `- ${test.name}: ${test.error_message || 'No error message'}\n`;
                  });
                }
              } else {
                summary += "No test report found.\n";
              }

              github.rest.issues.createComment({
                issue_number: context.issue.number,
                owner: context.repo.owner,
                repo: context.repo.repo,
                body: summary
              });
            } catch (error) {
              console.error('Error creating PR comment:', error);
              github.rest.issues.createComment({
                issue_number: context.issue.number,
                owner: context.repo.owner,
                repo: context.repo.repo,
                body: "## MCP HTTP Validation\n\nError generating test results summary. Check the workflow logs for details."
              });
            }
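
The PR-summary step reads whatever JSON report the compliance test writes into `reports/` and only touches four fields: `protocol_version`, `success_rate`, `total_tests`, and a `test_cases` array whose entries carry `name`, `status`, and `error_message`. A minimal sketch of that shape and of the summary-building logic, factored into a plain function so it can be exercised outside the workflow (the field names come from the script above; the sample values are illustrative, not real test output):

```javascript
// Illustrative report shape consumed by the PR-summary step.
// Field names match the workflow script; the values here are made up.
const reportData = {
  protocol_version: "2025-03-26",
  success_rate: "95%",
  total_tests: 20,
  test_cases: [
    { name: "initialize", status: "passed" },
    { name: "tools/list", status: "failed", error_message: "timeout after 5s" },
  ],
};

// Same summary logic as the github-script step, as a testable function.
function buildSummary(reportData) {
  let summary = "## MCP HTTP Validation Results\n\n";
  summary += `- Protocol Version: ${reportData.protocol_version || "Unknown"}\n`;
  summary += `- Success Rate: ${reportData.success_rate || "Unknown"}\n`;
  summary += `- Tests Run: ${reportData.total_tests || 0}\n`;

  // Only failed cases get itemized in the PR comment.
  const failedTests = (reportData.test_cases || []).filter(tc => tc.status === "failed");
  if (failedTests.length > 0) {
    summary += `\n### Failed Tests (${failedTests.length}):\n\n`;
    failedTests.forEach(test => {
      summary += `- ${test.name}: ${test.error_message || "No error message"}\n`;
    });
  }
  return summary;
}

console.log(buildSummary(reportData));
```

Keeping the report reader tolerant (`|| 'Unknown'`, `|| []`) matters here: if the compliance script changes its output format, the comment degrades to "Unknown" fields instead of the step throwing and falling into the generic error comment.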