Technical Workflow
Below is a detailed breakdown of the architecture diagram, which shows how the user-facing web app connects through a GraphQL API to various backend services, AI components, and infrastructure layers:
Web App (React / Next.js)
This is the frontend that end users interact with. Built in React/Next.js, it handles all UI rendering, routing, and initial data fetching.
API (GraphQL)
The frontend sends all requests (queries and mutations) to a GraphQL endpoint.
GraphQL unifies data fetching, aggregation, and mutating operations, acting as the bridge between the web app and backend services.
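For illustration, a client-side call might look like the sketch below; the /api/graphql path and the project query are assumptions, not the actual Unreality schema:

```typescript
// Minimal sketch of a frontend GraphQL call. The endpoint path and the
// `project` query are illustrative assumptions, not the real schema.
async function fetchProject(contractAddress: string) {
  const res = await fetch("/api/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: `
        query Project($address: String!) {
          project(contractAddress: $address) {
            name
            score
          }
        }`,
      variables: { address: contractAddress },
    }),
  });
  const { data, errors } = await res.json();
  if (errors) throw new Error(errors[0].message);
  return data.project;
}
```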
Once a GraphQL request arrives, it fans out to multiple backend modules:
DBMS (MongoDB / PostgreSQL)
All persistent data—user profiles, project scores, configuration settings, historical analytics—is stored in either a document store (MongoDB) or a relational database (PostgreSQL), depending on the use case.
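As a rough illustration of what gets persisted, a stored project-score record might be shaped like this; every field name here is an assumption:

```typescript
// Hypothetical shape of a persisted project-score document (MongoDB)
// or row (PostgreSQL). All field names are illustrative assumptions.
interface ProjectScore {
  contractAddress: string; // primary lookup key
  overallScore: number;    // unified multi-factor score
  contractRisk: number;    // sub-score from bytecode/heuristic checks
  socialSignal: number;    // sub-score from scraped social data
  websiteHealth: number;   // sub-score from WHOIS/tech-stack lookups
  flags: string[];         // e.g. "honeypot", "no-go"
  updatedAt: Date;         // last refresh by the data-processing jobs
}
```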
Authentication (NextAuth)
Handles user sign-up, login, session management, and OAuth flows. NextAuth sits alongside the API to secure endpoints and manage JWT tokens or session cookies.
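A minimal NextAuth setup is sketched below; the Google provider and JWT strategy are assumptions chosen for illustration, not a statement of which providers Unreality actually configures:

```typescript
// pages/api/auth/[...nextauth].ts — minimal NextAuth sketch.
// The provider choice is an illustrative assumption.
import NextAuth from "next-auth";
import GoogleProvider from "next-auth/providers/google";

export default NextAuth({
  providers: [
    GoogleProvider({
      clientId: process.env.GOOGLE_CLIENT_ID!,
      clientSecret: process.env.GOOGLE_CLIENT_SECRET!,
    }),
  ],
  session: { strategy: "jwt" }, // JWT tokens, as described above
});
```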
AI Integration (GoLang / OpenAI / Gemini)
Core AI logic (e.g., contract analysis, social scoring, AI autofill for onboarding) is implemented in a microservice written in Go.
This service calls out to OpenAI (o4-mini) or Google Gemini models for tasks such as:
Contract “explanation” / risk-scoring
Social sentiment analysis
Autofilling user configurations during onboarding
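The diagram does not show how the GraphQL layer talks to this Go service; one plausible shape, assuming a plain HTTP boundary with an /analyze route (both assumptions), is:

```typescript
// Sketch of the GraphQL layer delegating to the Go AI service over HTTP.
// The service URL and the /analyze route are illustrative assumptions.
const AI_SERVICE_URL = process.env.AI_SERVICE_URL ?? "http://ai-service:8080";

async function analyzeContract(contractAddress: string) {
  const res = await fetch(`${AI_SERVICE_URL}/analyze`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ contractAddress }),
  });
  if (!res.ok) throw new Error(`AI service returned ${res.status}`);
  return res.json(); // e.g. { riskScore, explanation, flags }
}
```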
Real-Time Updates (WebSockets)
A WebSocket server broadcasts live data (price spikes, sell signals, social alerts, etc.) back to the frontend.
Whenever a monitored project hits a trigger—say, a sudden dump in chart data or plummeting community engagement—the WebSocket pushes an immediate notification to the user’s dashboard.
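On the browser side, the subscription might look like the following sketch; the wss:// URL and the message shape are placeholders, not the real protocol:

```typescript
// Browser-side sketch of the live-alert subscription. The wss:// URL and
// the message shape are illustrative assumptions.
function showDashboardNotification(alert: { type: string; reason: string }) {
  console.log(`[alert] ${alert.type}: ${alert.reason}`); // stand-in for real UI
}

const socket = new WebSocket("wss://api.example.com/alerts"); // placeholder URL

socket.onmessage = (event: MessageEvent) => {
  const alert = JSON.parse(event.data);
  // e.g. { type: "SELL_SIGNAL", contractAddress: "0x...", reason: "whale dump" }
  if (alert.type === "SELL_SIGNAL") {
    showDashboardNotification(alert);
  }
};
```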
Wallet (Non-custodial built-in wallet)
Manages users’ on-chain interactions (sending transactions, approving smart contracts, executing auto-buys/auto-sells).
Implemented so users never have to leave the Unreality interface; private keys remain client-side, and transaction signing occurs locally in the browser.
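The document does not name the wallet library; assuming ethers (v6) purely for illustration, local signing looks roughly like this, with the key never leaving the client:

```typescript
// Sketch of client-side transaction signing, assuming the ethers library (v6).
// The RPC URL is a placeholder; the private key stays in the browser.
import { Wallet, JsonRpcProvider, parseEther } from "ethers";

async function sendLocallySigned(privateKey: string, to: string, amountEth: string) {
  const provider = new JsonRpcProvider("https://rpc.example.com"); // placeholder
  const wallet = new Wallet(privateKey, provider); // signing happens locally
  const tx = await wallet.sendTransaction({ to, value: parseEther(amountEth) });
  return tx.wait(); // resolves once the transaction is mined
}
```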
These components represent specialized AI workloads that support both “assistant” features and data processing pipelines:
AI Assistant & AI Autofill (Fine-tuned o4-mini)
Powers natural-language interactions in the UI (e.g., “Help me set my buy filters,” “What’s the risk score for this token?”).
Behind the scenes, it’s a version of the o4-mini model that’s been fine-tuned on Unreality’s domain data (smart contracts, Web3 terminology, risk-assessment examples).
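A call into that model might look like the sketch below, assuming the official openai Node SDK; the fine-tuned model identifier is a placeholder, not a real model name:

```typescript
// Sketch of an assistant call, assuming the official openai Node SDK.
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function askAssistant(userMessage: string) {
  const completion = await openai.chat.completions.create({
    model: "ft:o4-mini:unreality:placeholder", // hypothetical fine-tune ID
    messages: [
      { role: "system", content: "You are Unreality's crypto-research assistant." },
      { role: "user", content: userMessage },
    ],
  });
  return completion.choices[0].message.content;
}
```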
Data Processing (Gemini 2.5 / OpenAI fine-tuned o4-mini)
Handles large-scale analysis jobs—parsing social posts, extracting semantic signals from whitepapers, detecting honeypots in tokenomics.
Runs periodically or on demand to refresh scores for every monitored project. Gemini (v2.5) or an OpenAI model fine-tuned on Unreality’s proprietary dataset is used to classify “spam vs. legit” and generate the multi-factor project score.
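The “spam vs. legit” step, reduced to a sketch under the same assumptions (openai SDK, placeholder model ID, assumed prompt):

```typescript
// Sketch of the "spam vs. legit" classification step. The prompt and
// model ID are illustrative assumptions.
import OpenAI from "openai";

const client = new OpenAI();

async function classifyPost(post: string): Promise<"spam" | "legit"> {
  const completion = await client.chat.completions.create({
    model: "ft:o4-mini:unreality:placeholder", // hypothetical fine-tune ID
    messages: [
      {
        role: "system",
        content:
          "Classify the following crypto-community post. Reply with exactly one word: spam or legit.",
      },
      { role: "user", content: post },
    ],
  });
  const label = completion.choices[0].message.content?.trim().toLowerCase();
  return label === "spam" ? "spam" : "legit";
}
```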
Finally, everything above runs on a scalable hosting and proxy layer:
Hetzner / Azure VDS (Hosting)
Virtual dedicated servers distributed across multiple regions to reduce latency for scraping, AI inference, and blockchain RPC calls.
Ensures high availability for both backend services and real-time components.
Oxylabs / BrightData (Scraping Proxies)
All social-media and on-chain scraping is routed through rotating proxy endpoints provided by Oxylabs or BrightData.
Prevents IP bans and throttling when Unreality’s scraper harvests data from Telegram, Discord, X, Reddit, or various blockchain explorers.
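A minimal sketch of routing one scrape through a rotating proxy, assuming axios's built-in proxy support; host, port, and credentials are placeholders for the provider's gateway:

```typescript
// Sketch of a proxy-routed scrape, assuming axios. The host, port, and
// credentials are placeholders, not real provider endpoints.
import axios from "axios";

async function scrapeThroughProxy(targetUrl: string) {
  const res = await axios.get(targetUrl, {
    proxy: {
      protocol: "http",
      host: "proxy.example.com", // provider's rotating gateway
      port: 8080,
      auth: {
        username: process.env.PROXY_USER!,
        password: process.env.PROXY_PASS!,
      },
    },
    timeout: 15_000, // fail fast if the target throttles anyway
  });
  return res.data;
}
```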
How a Typical Request Flows
User Action (Frontend):
The trader logs in (via NextAuth) and requests a new “project score” by entering a contract address in the React/Next.js UI.
GraphQL Request:
The frontend sends a GraphQL mutation such as scoreProject(contractAddress: String!).
Backend Resolution:
The GraphQL API routes that request to the AI Integration service.
AI Integration spawns a workflow that:
Calls out to the contract-analysis endpoint (Go microservice) to fetch bytecode, run heuristic checks, and compute an initial risk score.
Invokes “social scoring” pipelines to scrape X/Twitter, Telegram, Reddit, Discord via Oxylabs proxies.
Aggregates website/WHOIS lookups (pulling domain data and tech-stack info).
Calls Gemini or OpenAI models to perform NLP classification on whitepapers and social posts.
Data Aggregation & Scoring:
Once the AI Integration service collects every sub-score (contract risk, social signals, website health, team credibility, etc.), it generates a unified “project score” object and writes it to the DBMS. A condensed code sketch of this resolution follows the flow below.
API Response & UI Update:
The GraphQL API returns the new score to the frontend.
If the user has a real-time subscription (via WebSockets), the dashboard panel updates immediately—showing a color-coded risk level, component breakdown, and any “no-go” flags.
Ongoing Monitoring:
Separately, a background job continues to run scraping and on-chain watchers.
If any live metric (e.g., whale dump, zero Telegram messages in the last hour, sudden negative sentiment spike) triggers a “sell” rule, the system:
Emits a WebSocket event alerting the user.
Invokes the Wallet service to auto-execute a sell transaction if the user has enabled “auto-sell.”
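Condensing the resolution steps above into code, the resolver's core logic might look like this sketch; every helper is a stand-in for the subsystem named in its comment, and the weights are assumptions:

```typescript
// Condensed sketch of the scoreProject flow. Each helper below is a
// stand-in for a real subsystem; the score weights are assumptions.
type Score = { contractAddress: string; overallScore: number; updatedAt: Date };

const analyzeContract = async (_: string) => 0.8; // Go microservice stub
const scoreSocial = async (_: string) => 0.6;     // proxy-routed scraping stub
const checkWebsite = async (_: string) => 0.9;    // WHOIS/tech-stack stub
const saveScore = async (s: Score) => console.log("saved", s);            // DBMS stub
const publishAlert = (topic: string, s: Score) => console.log(topic, s);  // WebSocket stub

async function resolveScoreProject(contractAddress: string): Promise<Score> {
  // Fan out the independent sub-analyses in parallel.
  const [contractRisk, socialSignal, websiteHealth] = await Promise.all([
    analyzeContract(contractAddress),
    scoreSocial(contractAddress),
    checkWebsite(contractAddress),
  ]);

  // Weighted combination into the unified project score (weights assumed).
  const overallScore = 0.5 * contractRisk + 0.3 * socialSignal + 0.2 * websiteHealth;

  const score: Score = { contractAddress, overallScore, updatedAt: new Date() };
  await saveScore(score);               // persist to the DBMS
  publishAlert("score_updated", score); // push over WebSockets
  return score;                         // returned through the GraphQL API
}
```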
This layered architecture ensures:
Modularity: Each component (DB, Auth, AI, Real-Time, Wallet) can scale or update independently.
Performance: Distributed hosting + proxy network speeds up scraping and AI inference.
Flexibility: GraphQL makes it straightforward to add new data sources (e.g., a new chain or social platform) without rewriting the entire API.
Security: Non-custodial wallet design keeps user keys client-side, while NextAuth handles secure sessions.
AI-Driven Intelligence: Fine-tuned o4-mini and Gemini models power both user-facing assistance and deep data processing routines.
In short, this diagram illustrates how Unreality’s frontend, API, backend microservices, AI modules, and infrastructure collaborate to deliver a fully automated, 24/7 crypto-investing companion.
Hetzner and Azure: reliable, high-performance Virtual Dedicated Server providers, hosting the web app and backend across multiple regions for consistently low latency.
BrightData and Oxylabs: well-established proxy providers supplying the rotating endpoints that keep scraping fast and uninterrupted.