
HuangtingFlux × LangChain

Use langchain-mcp-adapters to connect the Huangting Protocol's three-stage SOP to your LangChain agent and automatically reduce token usage by 40%.

✓ No Auth Required ✓ Streamable HTTP ✓ LangGraph Compatible ✓ MCP 2025-12-11

1

Install Dependencies

$ pip install langchain-mcp-adapters langchain-openai

HuangtingFlux is a standard remote MCP server. Connect via langchain-mcp-adapters — no additional SDK needed.

2

Quick Start

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

async def main():
    # Connect to HuangtingFlux MCP server
    client = MultiServerMCPClient(
        {
            "huangting": {
                "transport": "streamable_http",
                "url": "https://mcp.huangting.ai/mcp",
            }
        }
    )

    # Load the three-stage SOP tools:
    # start_task, report_step_result, finalize_and_report, get_network_stats
    tools = await client.get_tools()
    print(f"Loaded tools: {[t.name for t in tools]}")

    llm = ChatOpenAI(model="gpt-4o", temperature=0)
    agent = create_react_agent(llm, tools)

    # Agent will automatically follow the Huangting Protocol three-stage SOP
    response = await agent.ainvoke({
        "messages": [{
            "role": "user",
            "content": "Analyze the core differences between LangChain and CrewAI frameworks."
        }]
    })
    print(response)

if __name__ == "__main__":
    asyncio.run(main())
💡 The Agent will automatically call start_task at the beginning, report_step_result after each step, and finalize_and_report at the end, following the mandatory three-stage SOP.
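For illustration, the three-stage call order can be mocked in pure Python, with no LLM or network connection. The tool names below match the server's SOP tools; the payload fields are hypothetical stand-ins for what the real server returns, and in practice these calls go through the MCP tools loaded by the client.

```python
# Mock of the Huangting three-stage SOP call order (no LLM, no network).
# Tool names match the server's; payload fields are illustrative only.

call_log = []

def start_task(task_description: str) -> dict:
    call_log.append("start_task")
    return {"task_id": "task-001", "compressed_brief": task_description[:80]}

def report_step_result(task_id: str, step_number: int, step_result: str) -> dict:
    call_log.append("report_step_result")
    return {"rolling_summary": f"step {step_number}: {step_result[:40]}"}

def finalize_and_report(task_id: str, final_output: str) -> dict:
    call_log.append("finalize_and_report")
    return {"refined_output": final_output}

# Stage 1: compress the prompt into a task brief
brief = start_task("Analyze the core differences between LangChain and CrewAI.")
# Stage 2: report each intermediate step against the same task_id
report_step_result(brief["task_id"], 1, "Collected framework docs")
report_step_result(brief["task_id"], 2, "Compared agent abstractions")
# Stage 3: finalize and get the performance report
finalize_and_report(brief["task_id"], "LangChain is graph-first; CrewAI is role-first.")

print(call_log)
```

The point is the ordering: one start_task, one report_step_result per sub-step, one finalize_and_report, all sharing the same task_id.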

3

Stateful Session (Recommended for Long Tasks)

For tasks that span multiple steps, use client.session() to create a persistent session and ensure task_id remains consistent throughout the task lifecycle.

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_mcp_adapters.tools import load_mcp_tools
from langchain.agents import create_react_agent
from langchain_openai import ChatOpenAI

async def main():
    client = MultiServerMCPClient({
        "huangting": {
            "transport": "streamable_http",
            "url": "https://mcp.huangting.ai/mcp",
        }
    })

    # Use a stateful session to maintain context across tool calls
    async with client.session("huangting") as session:
        tools = await load_mcp_tools(session)
        llm = ChatOpenAI(model="gpt-4o")
        agent = create_react_agent(llm, tools)

        result = await agent.ainvoke({
            "messages": [{
                "role": "user",
                "content": "Research the current state of the MCP protocol ecosystem."
            }]
        })
        print(result)

asyncio.run(main())
4

LangGraph Integration

HuangtingFlux is fully compatible with LangGraph. Use langgraph.prebuilt.create_react_agent to build more complex agent graph structures.

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

async def main():
    client = MultiServerMCPClient({
        "huangting": {
            "transport": "streamable_http",
            "url": "https://mcp.huangting.ai/mcp",
        }
    })

    tools = await client.get_tools()
    model = ChatOpenAI(model="gpt-4o")

    # Build the agent with LangGraph's prebuilt ReAct graph
    agent = create_react_agent(model, tools)

    result = await agent.ainvoke({
        "messages": [{
            "role": "user",
            "content": "Compare the API pricing strategies of GPT-4o and Claude 3.7."
        }]
    })

    for msg in result["messages"]:
        print(f"[{msg.type}] {msg.content[:200]}")

asyncio.run(main())
5

Multi-Server Mode (HuangtingFlux as SOP Layer)

Use HuangtingFlux as the SOP optimization layer for all agent workflows alongside other tool servers. HuangtingFlux handles token management; other servers provide domain tools.

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

async def main():
    # Connect multiple MCP servers — HuangtingFlux as the SOP optimization layer
    client = MultiServerMCPClient({
        "huangting": {
            "transport": "streamable_http",
            "url": "https://mcp.huangting.ai/mcp",
            # No authentication required
        },
        # Add other MCP tool servers here
        # "your_tool_server": {
        #     "transport": "streamable_http",
        #     "url": "https://your-tool-server.com/mcp",
        #     "headers": {"Authorization": "Bearer YOUR_TOKEN"},
        # },
    })

    tools = await client.get_tools()
    model = ChatOpenAI(model="gpt-4o")
    agent = create_react_agent(model, tools)

    result = await agent.ainvoke({
        "messages": [{"role": "user", "content": "Start a multi-step research task"}]
    })
    print(result)

asyncio.run(main())

Tool Reference

4 MCP Tools Provided by HuangtingFlux

start_task (Stage 1)

Task start phase: compresses the input prompt (saving 30–60% of tokens) and returns a compressed task brief.

Parameters

task_description: str, task_type: str (optional)

Returns

compressed_brief, baseline_tokens, task_id

report_step_result (Stage 2)

Step reporting phase: call after each sub-step to generate a rolling summary that replaces the full conversation history.

Parameters

task_id: str, step_number: int, step_result: str

Returns

rolling_summary, tokens_used_this_step
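The rolling summary is what lets the agent drop earlier messages from its context. A minimal sketch of that replacement follows; the real summary comes back from report_step_result, so truncation stands in for it here, and the message shapes are illustrative.

```python
# Sketch: replacing accumulated step history with a single rolling summary.
# The real summary is produced server-side; truncation is a stand-in here.

history = [
    "Step 1 result: fetched 12 pages of MCP documentation ...",
    "Step 2 result: extracted the transport and auth sections ...",
    "Step 3 result: compared Streamable HTTP with SSE ...",
]

# Stub for the summary returned by report_step_result
rolling_summary = " | ".join(h[:30] for h in history)

# Subsequent turns carry one summary message instead of the full history.
context = [{"role": "assistant", "content": rolling_summary}]

print(len(history), "messages ->", len(context), "message")
```

Each new step repeats the cycle: report the step, receive an updated summary, and carry only that summary forward.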

finalize_and_report (Stage 3)

Task end phase: refines the final output and generates a verifiable token-savings performance report.

Parameters

task_id: str, final_output: str

Returns

refined_output, performance_report (with savings ratio and token comparison)
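The performance report compares baseline and actual token counts. Assuming the savings ratio is computed as (baseline - used) / baseline (the exact server-side formula is not documented here, and the numbers below are hypothetical), the arithmetic looks like:

```python
# Hypothetical savings calculation; field names follow the tool reference above,
# but the formula and the numbers are assumptions for illustration.
baseline_tokens = 12_000   # tokens the task would have used without the SOP
tokens_used = 7_200        # tokens actually consumed across all stages

savings_ratio = (baseline_tokens - tokens_used) / baseline_tokens
print(f"saved {savings_ratio:.0%}")  # -> saved 40%
```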

get_network_stats

Query real-time global stats: total connected agents, cumulative tokens saved, task type distribution.

Parameters

None

Returns

total_reports, total_tokens_saved, average_savings_ratio
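The stats come back as structured tool output. A sketch of consuming it, where the field names follow the tool reference above but the exact JSON shape and the sample values are assumptions:

```python
import json

# Hypothetical get_network_stats payload; field names come from the tool
# reference, but the JSON shape and values are assumed for illustration.
raw = '{"total_reports": 1523, "total_tokens_saved": 8400000, "average_savings_ratio": 0.41}'

stats = json.loads(raw)
print(f"{stats['total_reports']} reports, "
      f"{stats['total_tokens_saved']:,} tokens saved, "
      f"avg savings {stats['average_savings_ratio']:.0%}")
```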

Connection Info

MCP Endpoint

https://mcp.huangting.ai/mcp

Transport

Streamable HTTP (MCP 2025-12-11)

Authentication

None required (public access)

GitHub

XianDAO-Labs/huangting-flux-hub

Ready to Start?

View the full protocol documentation or start integrating now