Alexo19 committed
Commit 0ca069d · verified · 1 Parent(s): da3a00e

Implement:


- `generate_signal_from_image(image_bytes) -> dict`
- `generate_signal_from_market(symbol: str, timeframe: str) -> dict`

(A sketch of how the image path can compose the shared helpers follows below.)
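As a hedged sketch of how the screenshot path could mirror the market path committed below — `extract_chart_features` and `build_chart_prompt` are hypothetical helpers, while the other names appear in the `backend/signal_engine.py` diff further down:

```python
# Sketch only — written as if it lives next to generate_signal_from_market in
# backend/signal_engine.py; extract_chart_features and build_chart_prompt are
# hypothetical, the other names appear later in this diff.
from typing import Any, Dict


async def generate_signal_from_image(image_bytes: bytes, symbol: str = "BTCUSDT") -> Dict[str, Any]:
    """Turn a chart screenshot into the same signal dict the market path returns."""
    chart_features = extract_chart_features(image_bytes)   # hypothetical image_analysis helper
    sentiment = await get_crypto_sentiment(symbol[:3])     # same sentiment helper as the market path
    prompt = build_chart_prompt(chart_features, sentiment, symbol)  # hypothetical prompt builder
    llm_response = await model_registry.llm_reason(prompt)
    return parse_llm_response(llm_response, chart_features, {}, sentiment)
```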

1.7 Main FastAPI App

backend/main.py:

Define the routes:

- `GET /api/health` → `{ "status": "ok" }`
- `GET /api/models` → list the currently configured models and flags (inference/local).
- `POST /api/analyze/screenshot`
  - Accepts `multipart/form-data` with a `file` field.
  - Uses `image_analysis`, `timeseries_analysis` (if a symbol/timeframe can be inferred or is passed as an optional param), `sentiment_analysis`, and `signal_engine`.
- `POST /api/analyze/market`
  - JSON body: `{ "symbol": "BTCUSDT", "timeframe": "1h" }`.
  - Returns the signal JSON.
- `POST /api/webhook/signal`
  - Protected by `WEBHOOK_API_KEY` via the `X-API-KEY` header (if configured).
  - Accepts a signal-envelope JSON and simply echoes, logs, and lightly validates it.

Use Pydantic models for requests and responses (a minimal sketch follows below).
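As a hedged illustration, the request/response models and the `X-API-KEY` check could look like the sketch below; `MarketRequest` and `SignalResponse` are assumed names based on the signal schema described in this prompt, and `verify_api_key` matches the dependency referenced in the `backend/main.py` diff further down:

```python
# Sketch only — field names follow the signal JSON schema described in this prompt;
# verify_api_key and the X-API-KEY check are assumptions, not the committed code.
import os
from typing import List, Optional

from fastapi import Header, HTTPException
from pydantic import BaseModel


class MarketRequest(BaseModel):
    symbol: str = "BTCUSDT"
    timeframe: str = "1h"


class SignalResponse(BaseModel):
    direction: str                   # "long" | "short" | "neutral"
    entry_zone: List[float]          # [min_price, max_price]
    stop_loss: float
    take_profit_levels: List[float]  # [tp1, tp2, tp3]
    timeframe: str
    confidence: int                  # 0-100
    time_horizon: str                # "intra-day" | "swing" | "position"
    explanation: str
    meta: dict = {}


async def verify_api_key(x_api_key: Optional[str] = Header(default=None)) -> str:
    """Reject webhook calls when WEBHOOK_API_KEY is configured and the header does not match."""
    expected = os.getenv("WEBHOOK_API_KEY")
    if expected and x_api_key != expected:
        raise HTTPException(status_code=401, detail="Invalid or missing X-API-KEY")
    return x_api_key or ""
    # usage on the webhook route: api_key: str = Depends(verify_api_key)
```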

1.8 Root App Entrypoint

app.py:

- Create the FastAPI app.
- Include the `backend.main` router under `/api`.
- Serve the static frontend from `frontend/dist` at the root (`/`) using Starlette `StaticFiles`.
- Respect `FRONTEND_BASE_PATH` and `API_BASE_URL`.

===========================================================

2. FRONTEND – React + TypeScript + Vite + Tailwind

Use the given folder/file structure and fill in all code.

2.1 Frontend Setup Files

Generate:

- `frontend/package.json` with:
  - Vite + React + TypeScript
  - axios
  - react-router-dom
  - Tailwind + PostCSS + autoprefixer
- `vite.config.ts`: configure React and the base path if needed.
- `tailwind.config.js`: dark-theme friendly.
- `postcss.config.js`
- `index.html`: minimal HTML that mounts the React app.
- `src/index.css`: Tailwind base + custom dark theme.

2.2 App Entrypoint

`frontend/src/main.tsx`:

- ReactDOM `createRoot`.
- Wrap `<App />` with `BrowserRouter`.

`frontend/src/App.tsx` layout:

- `<Navbar />` at the top.
- `<Sidebar />` on the left.
- Main content using routes:
  - `/` → `<Dashboard />`
  - `/backtest` → `<Backtest />`
  - `/settings` → `<Settings />`
  - `/docs` → `<Docs />`

2.3 API Client

`frontend/src/lib/api.ts`:

- Base URL: use `import.meta.env.VITE_API_BASE_URL` with a fallback to `/api`.
- Functions:
  - `uploadChart(file: File): Promise<SignalResponse>`
  - `getMarketSignal(symbol: string, timeframe: string): Promise<SignalResponse>`
  - `getModels(): Promise<any>`
  - `postWebhookSignal(payload: any): Promise<any>`
- TypeScript types for `SignalResponse` matching the JSON schema described earlier.

2.4 Core Pages

`Dashboard.tsx`:

- Chart upload card.
- Pair selector (symbol + timeframe).
- Signal panel (direction, entry zone, SL, TP1/2/3, confidence, explanation).
- History table with previous signals.

`Backtest.tsx`:

- Placeholder UI explaining the future backtest feature.
- Table or card for uploading a CSV or selecting a pair/time range (UI only).

`Settings.tsx`:

- Form for:
  - HF token (display-only note; not actually stored in the frontend).
  - Toggle "Use Inference API vs Local".
- Display the currently active models from `/api/models`.

`Docs.tsx`:

- Developer docs showing the available endpoints, payload examples, and the webhook format.

2.5 Components

Implement these React components with Tailwind styling, responsive dark UI:

- `Navbar.tsx`: top nav bar with the title CryptoSignal-Sleuth and simple links.
- `Sidebar.tsx`: navigation links to Dashboard, Backtest, Settings, Docs.
- `ChartUploadCard.tsx`:
  - Drag-and-drop or file input.
  - On submit → call `uploadChart()` and show a spinner + the result.
- `PairSelector.tsx`:
  - Symbol dropdown/input (BTCUSDT, ETHUSDT, etc.).
  - Timeframe select (1m, 5m, 15m, 1h, 4h, 1d).
  - Button to call `getMarketSignal()`.
- `SignalPanel.tsx`: a nicely styled card showing:
  - Direction (color-coded).
  - Entry zone (two prices).
  - Stop loss.
  - Take-profit levels.
  - Confidence (progress bar or badge).
  - Time horizon.
  - Explanation text.
  - Sentiment summary from `meta.sentiment` if present.
- `HistoryTable.tsx`: simple in-memory list of the signals generated this session.
- `ModelSettingsPanel.tsx`: used on the Settings page to display the models from `/api/models`.

===========================================================

3. REQUIREMENTS + ENV

Generate a `requirements.txt` that includes at least:

- fastapi
- uvicorn[standard]
- python-multipart
- pydantic
- httpx or requests
- numpy
- pandas (if needed)
- ta (for technical indicators) or your own indicator math
- feedparser or similar (for RSS)
- jinja2 (if needed)
- any Hugging Face client libraries used

Also output `.env.example` content (`HF_TOKEN`, `USE_INFERENCE_API`, `INFERENCE_LLM_MODEL`, `LOCAL_LLM_MODEL`, `FRONTEND_BASE_PATH`, `API_BASE_URL`, `WEBHOOK_API_KEY`); a settings-loader sketch follows below.
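As a hedged sketch (not shown in this commit), the `backend/config.py` loader that `app.py` imports via `get_settings()` could read those variables like this; the defaults are assumptions copied from `.env.example`:

```python
# Sketch only — mirrors the attributes used elsewhere in the diff
# (USE_INFERENCE_API stays a string because backend/main.py compares it to "1").
import os
from functools import lru_cache

from dotenv import load_dotenv

load_dotenv()  # read .env if present; real environment variables still take precedence


class Settings:
    HF_TOKEN: str = os.getenv("HF_TOKEN", "")
    USE_INFERENCE_API: str = os.getenv("USE_INFERENCE_API", "1")
    INFERENCE_LLM_MODEL: str = os.getenv("INFERENCE_LLM_MODEL", "Qwen/Qwen2.5-7B-Instruct")
    LOCAL_LLM_MODEL: str = os.getenv("LOCAL_LLM_MODEL", "google/flan-t5-base")
    FRONTEND_BASE_PATH: str = os.getenv("FRONTEND_BASE_PATH", "/")
    API_BASE_URL: str = os.getenv("API_BASE_URL", "/api")
    WEBHOOK_API_KEY: str = os.getenv("WEBHOOK_API_KEY", "")


@lru_cache()
def get_settings() -> Settings:
    """Return a cached Settings instance so env parsing happens only once."""
    return Settings()
```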

===========================================================

4. HUGGING FACE SPACES DEPLOYMENT

Explain and/or generate the necessary config so that:

- `app.py` is the entrypoint for a Python Space.
- On start:
  - The backend is available at `/api`.
  - The frontend is served from the built `frontend/dist`.
- If the build is not pre-committed, show a postStart script example (an in-app fallback is sketched after the script):

```bash
cd frontend
npm install
npm run build
cd ..
```
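Alternatively — as an assumption that the Space image has Node/npm available, not something this commit does — `app.py` could trigger the build itself when `frontend/dist` is missing:

```python
# Sketch only — assumes node/npm exist in the Space image; otherwise pre-commit
# frontend/dist or use the postStart script above instead.
import subprocess
from pathlib import Path

FRONTEND_DIR = Path(__file__).parent / "frontend"
DIST_DIR = FRONTEND_DIR / "dist"


def ensure_frontend_built() -> None:
    """Build the Vite frontend once at startup if no dist/ bundle is committed."""
    if DIST_DIR.exists():
        return
    subprocess.run(["npm", "install"], cwd=FRONTEND_DIR, check=True)
    subprocess.run(["npm", "run", "build"], cwd=FRONTEND_DIR, check=True)


# Call this before mounting StaticFiles(directory="frontend/dist") in app.py.
ensure_frontend_built()
```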


===========================================================

5. AUTOMATION: n8n + Discord + Telegram

At the end of the code output, ALSO include:

5.1 n8n Workflow (JSON)

A ready-to-import n8n workflow that:

- Listens for a webhook from `/api/webhook/signal` or an external system.
- Formats the signal.
- Sends it to:
  - a Discord channel (via webhook), and
  - a Telegram chat (via the Bot API).
- Optionally appends it to Google Sheets.

A test-payload sender for that webhook is sketched below.
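To exercise that flow end to end, a small hedged test sender; the URL, env-var names, and the sample payload values are illustrative only:

```python
# Sketch only — point WEBHOOK_URL at /api/webhook/signal or at your n8n webhook node.
import os

import requests

WEBHOOK_URL = os.getenv("WEBHOOK_URL", "http://localhost:7860/api/webhook/signal")

sample_signal = {
    "direction": "long",
    "entry_zone": [64250.0, 64600.0],
    "stop_loss": 63100.0,
    "take_profit_levels": [65500.0, 66800.0, 68200.0],
    "timeframe": "1h",
    "confidence": 72,
    "time_horizon": "swing",
    "explanation": "Sample payload for wiring the automation, not a real signal.",
    "meta": {"sources": ["manual-test"]},
}

resp = requests.post(
    WEBHOOK_URL,
    json=sample_signal,
    headers={"X-API-KEY": os.getenv("WEBHOOK_API_KEY", "")},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```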

5.2 Discord Bot Snippet (Python)

A minimal bot that:

- Responds to `/signal BTCUSDT 5m` by calling `/api/analyze/market`.
- Formats and sends back the signal (sketched below).
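A hedged sketch using discord.py 2.x prefix commands (so `/signal` is handled as a message prefix rather than a registered slash command); `BACKEND_URL` and the token env-var name are assumptions:

```python
# Sketch only — prefix commands with "/" as the prefix, not registered slash
# commands; BACKEND_URL and DISCORD_BOT_TOKEN are assumed env-var names.
import asyncio
import os

import discord
import requests
from discord.ext import commands

BACKEND_URL = os.getenv("BACKEND_URL", "http://localhost:7860")

intents = discord.Intents.default()
intents.message_content = True  # needed to read "/signal BTCUSDT 5m" messages
bot = commands.Bot(command_prefix="/", intents=intents)


@bot.command(name="signal")
async def signal(ctx: commands.Context, symbol: str = "BTCUSDT", timeframe: str = "1h"):
    """Call POST /api/analyze/market and echo the formatted signal back."""
    resp = await asyncio.to_thread(
        requests.post,
        f"{BACKEND_URL}/api/analyze/market",
        json={"symbol": symbol, "timeframe": timeframe},
        timeout=60,
    )
    resp.raise_for_status()
    s = resp.json()
    await ctx.send(
        f"**{symbol} {timeframe}** → {s['direction'].upper()}\n"
        f"Entry: {s['entry_zone']} | SL: {s['stop_loss']} | TPs: {s['take_profit_levels']}\n"
        f"Confidence: {s['confidence']}%"
    )


bot.run(os.environ["DISCORD_BOT_TOKEN"])
```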

5.3 Telegram Bot Snippet (Python)

A minimal bot that:

- Responds to `/signal BTCUSDT 5m`.
- Calls `/api/analyze/market`.
- Replies with nicely formatted text: direction, entry, SL, TPs, confidence (sketched below).
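A hedged sketch assuming python-telegram-bot v20+; the backend URL and env-var names are the same assumptions as in the Discord sketch:

```python
# Sketch only — assumes python-telegram-bot v20+; TELEGRAM_BOT_TOKEN and
# BACKEND_URL are assumed env-var names.
import asyncio
import os

import requests
from telegram import Update
from telegram.ext import Application, CommandHandler, ContextTypes

BACKEND_URL = os.getenv("BACKEND_URL", "http://localhost:7860")


async def signal(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    """Handle /signal <SYMBOL> <TIMEFRAME> by calling the analyze/market endpoint."""
    args = context.args or []
    symbol = args[0] if len(args) > 0 else "BTCUSDT"
    timeframe = args[1] if len(args) > 1 else "1h"

    resp = await asyncio.to_thread(
        requests.post,
        f"{BACKEND_URL}/api/analyze/market",
        json={"symbol": symbol, "timeframe": timeframe},
        timeout=60,
    )
    resp.raise_for_status()
    s = resp.json()
    await update.message.reply_text(
        f"{symbol} {timeframe}: {s['direction'].upper()}\n"
        f"Entry: {s['entry_zone']}\nSL: {s['stop_loss']}\nTPs: {s['take_profit_levels']}\n"
        f"Confidence: {s['confidence']}%"
    )


def main() -> None:
    app = Application.builder().token(os.environ["TELEGRAM_BOT_TOKEN"]).build()
    app.add_handler(CommandHandler("signal", signal))
    app.run_polling()


if __name__ == "__main__":
    main()
```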

===========================================================

6. FINAL SUMMARY

At the very end, output a short summary covering:

- How the repo is structured.
- How to run locally:
  - `pip install -r requirements.txt`
  - `cd frontend && npm install && npm run build`
  - `uvicorn app:app --host 0.0.0.0 --port 7860`
- How to deploy to Hugging Face Spaces.

===========================================================

Now generate the COMPLETE, PRODUCTION-READY CODEBASE with all files filled out, no pseudo-code, no TODOs, and consistent imports. Use clear, clean, maintainable code.


---

## ⚙️ What You Do Next

1. Open that DeepSite link.
2. Paste this whole prompt into the big text box.
3. Let it generate the **repo/code bundle**.
4. Download the code, push it to your Hugging Face Space (or update existing Space files).
5. Add your `HF_TOKEN` in the Space’s **Secrets**.

If you want, next step I can help you:
- Wire this EXACT Space with **n8n flows** you already have,
- Or design a **Tradvio-style UI theme** with your ETHERNAL/ETERNAL branding.

.env.example ADDED
@@ -0,0 +1,12 @@
+# Hugging Face Settings
+HF_TOKEN=your_hf_token_here
+USE_INFERENCE_API=1
+INFERENCE_LLM_MODEL=Qwen/Qwen2.5-7B-Instruct
+LOCAL_LLM_MODEL=google/flan-t5-base
+
+# Path Settings
+FRONTEND_BASE_PATH=/
+API_BASE_URL=/api
+
+# Security
+WEBHOOK_API_KEY=your_secure_api_key_here
app.py ADDED
@@ -0,0 +1,23 @@
+from fastapi import FastAPI
+from fastapi.staticfiles import StaticFiles
+from backend.main import app as api_app
+from backend.config import get_settings
+import os
+
+settings = get_settings()
+
+app = FastAPI(title="CryptoSignal Sleuth Pro")
+
+# Mount the API under /api
+app.mount(settings.API_BASE_URL, api_app)
+
+# Serve static files from frontend/dist
+app.mount(
+    settings.FRONTEND_BASE_PATH,
+    StaticFiles(directory="frontend/dist", html=True),
+    name="static"
+)
+
+if __name__ == "__main__":
+    import uvicorn
+    uvicorn.run(app, host="0.0.0.0", port=7860)
backend/main.py CHANGED
@@ -66,14 +66,56 @@ async def webhook_handler(
 ):
     # Process webhook payload here
     return {"status": "received", "data": payload}
+@app.get("/api/health")
+async def health_check():
+    return {"status": "ok"}

 @app.get("/api/models")
 async def list_models():
     settings = get_settings()
     return {
         "llm_model": settings.INFERENCE_LLM_MODEL if settings.USE_INFERENCE_API == "1" else settings.LOCAL_LLM_MODEL,
-        "using_inference_api": settings.USE_INFERENCE_API == "1"
+        "using_inference_api": settings.USE_INFERENCE_API == "1",
+        "models": {
+            "inference": settings.INFERENCE_LLM_MODEL,
+            "local": settings.LOCAL_LLM_MODEL
+        }
     }
+
+@app.post("/api/analyze/screenshot")
+async def analyze_screenshot(
+    file: UploadFile = File(...),
+    symbol: Optional[str] = None,
+    timeframe: Optional[str] = None
+):
+    if not file.content_type.startswith("image/"):
+        raise HTTPException(400, detail="File must be an image")
+
+    image_bytes = await file.read()
+    signal = await generate_signal_from_image(image_bytes, symbol or "BTCUSDT")
+    return signal
+
+@app.post("/api/analyze/market")
+async def analyze_market(data: dict):
+    try:
+        symbol = data.get("symbol", "BTCUSDT")
+        timeframe = data.get("timeframe", "1h")
+        return await generate_signal_from_market(symbol, timeframe)
+    except Exception as e:
+        raise HTTPException(500, detail=str(e))
+
+@app.post("/api/webhook/signal")
+async def webhook_signal(
+    payload: dict,
+    api_key: str = Depends(verify_api_key)
+):
+    # Basic validation
+    if not payload.get("direction"):
+        raise HTTPException(400, detail="Missing required field: direction")
+
+    # Log the signal
+    print(f"Received signal: {payload}")
+    return {"status": "received", "signal": payload}
 ```
 ___METADATA_START___
 {"repoId":"Alexo19/cryptosignal-sleuth-pro","isNew":false,"userName":"Alexo19"}
backend/signal_engine.py CHANGED
@@ -64,9 +64,59 @@ Provide your response in this exact JSON format:
     "sentiment": {{}}
   }}
 }}"""
+async def generate_signal_from_market(symbol: str, timeframe: str) -> Dict[str, Any]:
+    """Generate trading signal from live market data."""
+    # Get technical analysis
+    ohlcv = await fetch_klines(symbol, timeframe)
+    technicals = compute_technicals(ohlcv)
+
+    # Get sentiment analysis
+    sentiment = await get_crypto_sentiment(symbol[:3])  # Extract base symbol
+
+    # Prepare LLM prompt
+    prompt = build_market_prompt(technicals, sentiment, symbol, timeframe)
+
+    # Get LLM reasoning
+    llm_response = await model_registry.llm_reason(prompt)
+
+    # Parse response into structured format
+    return parse_llm_response(llm_response, {}, technicals, sentiment)
+
+def build_market_prompt(technicals: Dict, sentiment: Dict, symbol: str, timeframe: str) -> str:
+    return f"""Analyze this crypto trading situation and provide a professional trading signal in JSON format:
+
+Technical Indicators ({timeframe} timeframe):
+- Trend: {technicals['trend']}
+- Momentum: {technicals['momentum']}
+- Volatility: {technicals['volatility']}
+- Support: {technicals['support']}
+- Resistance: {technicals['resistance']}
+
+Market Sentiment:
+- Score: {sentiment.get('score', 0)}
+- Bullish: {sentiment.get('bullish', 0)}
+- Bearish: {sentiment.get('bearish', 0)}
+
+Symbol: {symbol}
+
+Provide your response in this exact JSON format:
+{{
+  "direction": "long|short|neutral",
+  "entry_zone": [min_price, max_price],
+  "stop_loss": price,
+  "take_profit_levels": [tp1, tp2, tp3],
+  "timeframe": "{timeframe}",
+  "confidence": 0-100,
+  "time_horizon": "intra-day|swing|position",
+  "explanation": "brief rationale",
+  "meta": {{
+    "sources": ["technicals", "sentiment"],
+    "sentiment": {{}}
+  }}
+}}"""

 def parse_llm_response(response: str, chart_features: Dict, technicals: Dict, sentiment: Dict) -> Dict[str, Any]:
     try:
         data = json.loads(response.strip())
         if not isinstance(data, dict):
             raise ValueError("Invalid response format")
frontend/tailwind.config.js ADDED
@@ -0,0 +1,20 @@
+module.exports = {
+  content: [
+    "./index.html",
+    "./src/**/*.{js,ts,jsx,tsx}",
+  ],
+  theme: {
+    extend: {
+      colors: {
+        primary: {
+          500: '#6366f1',
+        },
+        secondary: {
+          500: '#ec4899',
+        }
+      }
+    }
+  },
+  darkMode: 'class',
+  plugins: [],
+}
frontend/vite.config.ts ADDED
@@ -0,0 +1,16 @@
+import { defineConfig } from 'vite'
+import react from '@vitejs/plugin-react'
+import { resolve } from 'path'
+
+export default defineConfig({
+  plugins: [react()],
+  resolve: {
+    alias: {
+      '@': resolve(__dirname, './src'),
+    },
+  },
+  base: process.env.VITE_BASE_PATH || '/',
+  server: {
+    port: 3000,
+  },
+})
requirements.txt ADDED
@@ -0,0 +1,11 @@
+fastapi==0.95.2
+uvicorn[standard]==0.22.0
+python-multipart==0.0.6
+pydantic==1.10.7
+requests==2.28.2
+numpy==1.24.3
+pandas==2.0.1
+feedparser==6.0.10
+huggingface-hub==0.15.1
+transformers==4.29.2
+python-dotenv==1.0.0