LDL Health Plan - Single Documentation

A single-file, dark-mode HTML document covering setup, architecture, important code, and a screenshot gallery (1.jpeg to 13.jpeg).

1) How to Start Project

Backend

cd "c:\Reaserch\RP 47"
python -m venv venv
venv\Scripts\activate
pip install -r backend\requirements.txt
uvicorn backend.main:app --reload --host 0.0.0.0 --port 8000
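Once uvicorn is running, it is worth confirming the backend answers before wiring up the app. A minimal smoke check (an illustrative helper, not part of the repo; FastAPI serves its interactive docs at /docs by default):

```python
import urllib.request

def build_health_url(host="127.0.0.1", port=8000, path="/docs"):
    """Build a URL for a quick reachability check (FastAPI serves /docs by default)."""
    return f"http://{host}:{port}{path}"

def backend_is_up(url=None, timeout=3):
    """Return True if the backend answers with HTTP 200, False otherwise."""
    url = url or build_health_url()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, timeout, DNS failure, ...
        return False
```

From a phone on the same network, the same check applies to http://YOUR_IP:8000.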

Frontend (Expo)

npm install
npm start

Set backend URL in .env: EXPO_PUBLIC_API_URL=http://YOUR_IP:8000

2) How LLM Is Trained

The chatbot model is fine-tuned by train_chatbot_ldl.py; the resulting adapter is saved to smol-360m-ldl-guide/ and loaded at runtime by backend/llm.py.
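The actual training code lives in train_chatbot_ldl.py and is not reproduced here. As a hedged sketch, supervised examples for the guideline task could be formatted like this (the function name and prompt template are hypothetical, chosen to mirror the riskLevel/topFactors fields used by the /guideline endpoint):

```python
def format_guideline_example(risk_level, top_factors, target_text):
    """Hypothetical prompt/completion pair for supervised fine-tuning.

    risk_level  -- e.g. "High" (assumed label set)
    top_factors -- e.g. ["LDL", "BMI"] (assumed factor names)
    target_text -- the reference guideline the model should learn to produce
    """
    prompt = (
        "LDL risk level: " + risk_level + "\n"
        "Top factors: " + ", ".join(top_factors) + "\n"
        "Write a short, actionable guideline:"
    )
    return {"prompt": prompt, "completion": target_text}
```

Pairs in this shape would then be tokenized and fed to the fine-tuning loop that produces the smol-360m-ldl-guide adapter.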

3) File Connections

Part              Connected To
App API calls     utils/api.js -> constants/api.js -> backend endpoints
Backend routes    backend/main.py -> backend/predict.py and backend/llm.py
Guideline state   context/GuidelineContext.js -> all plan/reminder/motivational screens
LLM runtime       backend/llm.py loads adapter from smol-360m-ldl-guide

4) File Structure

RP 47/
|- App.js
|- constants/api.js
|- utils/api.js
|- context/GuidelineContext.js
|- navigation/MainTabs.js
|- screens/...
|- backend/main.py
|- backend/predict.py
|- backend/llm.py
|- ml/output/ldl_risk_model.joblib
|- train_chatbot_ldl.py
|- smol-360m-ldl-guide/
|- assets/
|- doc/index.html

5) Assets + Doc Paths

App assets: c:\Reaserch\RP 47\assets\

This documentation: c:\Reaserch\RP 47\doc\index.html

Requested screenshot files: c:\Reaserch\RP 47\doc\1.jpeg ... c:\Reaserch\RP 47\doc\13.jpeg

6) Screenshots (One by One) – Application flow

These 13 screenshots are saved as doc/1.jpeg ... doc/13.jpeg. Each one corresponds to a specific screen or state in the LDL app.

7) Project Screenshots (Development)

Terminal output from guideline and chatbot training runs.

8) Important Code Parts

API base URL

// constants/api.js
// EXPO_PUBLIC_API_URL comes from .env; the hard-coded IP is a dev-only fallback.
const DEV_API_URL = process.env.EXPO_PUBLIC_API_URL || 'http://192.168.1.1:8000';
// __DEV__ is set by Expo/React Native; production builds need a real URL configured here.
export const API_BASE_URL = __DEV__ ? DEV_API_URL : null;

Backend LLM endpoints

@app.post("/guideline")
def api_guideline(req: GuidelineRequest):
    # Build a personalised guideline from the risk level and top contributing factors.
    data = llm_module.generate_guideline(req.riskLevel, req.topFactors)
    if data is None:
        # LLM not available (e.g. adapter failed to load) -> surface as 503
        raise HTTPException(status_code=503, detail="Guideline LLM unavailable.")
    return data

@app.post("/chat")
def api_chat(req: ChatRequest):
    # Generate a chatbot reply, conditioned on the conversation history.
    reply = llm_module.generate_chat(req.message, req.history)
    if reply is None:
        raise HTTPException(status_code=503, detail="Chat LLM unavailable.")
    return {"reply": reply}
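Both endpoints validate their request bodies with GuidelineRequest and ChatRequest (Pydantic models defined in the backend but not shown here). Their field names follow from the code above; as a dependency-free sketch, the expected shapes look roughly like this (stdlib dataclasses stand in for Pydantic, and the field types are assumptions):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GuidelineRequest:
    riskLevel: str                      # e.g. "High" (assumed label values)
    topFactors: List[str] = field(default_factory=list)

@dataclass
class ChatRequest:
    message: str
    history: List[dict] = field(default_factory=list)  # prior turns (assumed shape)
```

In the real backend these would subclass pydantic.BaseModel so FastAPI can parse and validate the JSON body automatically.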
