# LangGraph

ChatOpenAI-compatible client for seamless integration with the LangChain and LangGraph frameworks.

## Installation

To use the LangGraph integration, install Sequrity with the `langgraph` dependency group:
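A typical install command would be the following (the exact package name and extra are assumptions based on the surrounding text; check your package index):

```shell
pip install "sequrity[langgraph]"
```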
This installs the required `langgraph` package along with Sequrity.
## API Reference
LangChain/LangGraph integration for Sequrity Control.
This module provides a ChatOpenAI-compatible client that routes requests through Sequrity's secure orchestrator with automatic session management and security features.
### Example

```python
from sequrity.control.integrations.langgraph import create_sequrity_langgraph_client
from sequrity.control import FeaturesHeader, SecurityPolicyHeader

# Create a client with Sequrity security features
llm = create_sequrity_langgraph_client(
    sequrity_api_key="your-sequrity-key",
    features=FeaturesHeader.dual_llm(),
    security_policy=SecurityPolicyHeader.dual_llm(),
)

# Use with LangChain
response = llm.invoke([{"role": "user", "content": "Hello!"}])

# Use with LangGraph
from langgraph.graph import StateGraph

llm_with_tools = llm.bind_tools([...])
# ... build your graph
```
Classes:

- `LangGraphChatSequrityAI` – ChatOpenAI client configured to route requests through Sequrity's secure orchestrator.

Functions:

- `create_sequrity_langgraph_client` – Create a ChatOpenAI-compatible client with Sequrity security features for LangGraph.
### LangGraphChatSequrityAI

```python
LangGraphChatSequrityAI(
    sequrity_api_key: str,
    features: FeaturesHeader | None = None,
    security_policy: SecurityPolicyHeader | None = None,
    fine_grained_config: FineGrainedConfigHeader | None = None,
    service_provider: LlmServiceProvider | LlmServiceProviderStr = OPENROUTER,
    llm_api_key: str | None = None,
    base_url: str | None = None,
    endpoint_type: EndpointType | str = CHAT,
    model: str = "gpt-4",
    **kwargs: Any,
)
```
ChatOpenAI client configured to route requests through Sequrity's secure orchestrator.
This client is a drop-in replacement for ChatOpenAI that automatically:

- Adds Sequrity security headers (features, policies, configuration)
- Tracks session IDs across multiple requests
- Routes to Sequrity's API endpoint
The client maintains session state across multiple chat completion requests, which is essential for Sequrity's dual-LLM architecture to maintain context.
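The session-tracking pattern described above can be illustrated with a minimal stand-in. This is not the actual implementation; the `X-Session-Id` header name, the response shape, and the `_send` stub are assumptions for illustration only:

```python
class SessionTrackingClient:
    """Illustrative mock of a client that threads a session ID across requests."""

    def __init__(self) -> None:
        self._session_id: str | None = None

    def invoke(self, messages: list[dict]) -> dict:
        # Attach the current session ID (if any) to the outgoing request;
        # a real client would send this as a header to the orchestrator.
        headers: dict[str, str] = {}
        if self._session_id is not None:
            headers["X-Session-Id"] = self._session_id
        response = self._send(messages, headers)
        # Capture the session ID returned by the server so follow-up
        # requests continue the same conversation context.
        self._session_id = response["session_id"]
        return response

    def _send(self, messages: list[dict], headers: dict[str, str]) -> dict:
        # Stand-in for the network call: reuse the incoming session ID,
        # or mint a new one on the first request.
        return {
            "session_id": headers.get("X-Session-Id", "sess-123"),
            "content": f"echo: {messages[-1]['content']}",
        }


client = SessionTrackingClient()
r1 = client.invoke([{"role": "user", "content": "Hello"}])
r2 = client.invoke([{"role": "user", "content": "Continue"}])
assert r1["session_id"] == r2["session_id"]  # same conversation context
```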
Parameters:

- `sequrity_api_key` (`str`) – Sequrity API key (used as the bearer token)
- `features` (`FeaturesHeader | None`, default: `None`) – Security features configuration (LLM mode, taggers, constraints)
- `security_policy` (`SecurityPolicyHeader | None`, default: `None`) – Security policy configuration (SQRT/Cedar policies)
- `fine_grained_config` (`FineGrainedConfigHeader | None`, default: `None`) – Fine-grained session configuration
- `service_provider` (`LlmServiceProvider | LlmServiceProviderStr`, default: `OPENROUTER`) – LLM service provider (`LlmServiceProvider` enum or string literal)
- `llm_api_key` (`str | None`, default: `None`) – Optional API key for the LLM provider
- `base_url` (`str | None`, default: `None`) – Sequrity base URL (default: https://api.sequrity.ai)
- `endpoint_type` (`EndpointType | str`, default: `CHAT`) – Endpoint type (`chat`, `code`, `lang-graph`). Defaults to `chat`.
- `model` (`str`, default: `'gpt-4'`) – Model name to use
- `**kwargs` (`Any`, default: `{}`) – Additional arguments passed to ChatOpenAI
#### Example

```python
features = FeaturesHeader.dual_llm()

llm = LangGraphChatSequrityAI(
    sequrity_api_key="your-key",
    features=features,
)

# Session ID is automatically tracked
response1 = llm.invoke([{"role": "user", "content": "Hello"}])
response2 = llm.invoke([{"role": "user", "content": "Continue"}])  # Uses the same session
```
Methods:

- `reset_session` – Reset the session ID, starting a new conversation context.
- `set_session_id` – Manually set the session ID.

Attributes:

- `session_id` (`str | None`) – Get the current session ID, if any.
#### reset_session
Reset the session ID, starting a new conversation context.
Call this method when you want to start a fresh conversation without carrying over context from previous requests.
#### set_session_id

```python
set_session_id(session_id: str | None) -> None
```
Manually set the session ID.
Use this to resume a previous conversation or share sessions across clients.
Parameters:

- `session_id` (`str | None`) – Session ID to use, or `None` to clear it
### create_sequrity_langgraph_client

```python
create_sequrity_langgraph_client(
    sequrity_api_key: str,
    features: FeaturesHeader | None = None,
    security_policy: SecurityPolicyHeader | None = None,
    fine_grained_config: FineGrainedConfigHeader | None = None,
    service_provider: LlmServiceProvider | LlmServiceProviderStr = OPENROUTER,
    llm_api_key: str | None = None,
    base_url: str | None = None,
    endpoint_type: EndpointType | str = CHAT,
    model: str = "gpt-4",
    **kwargs: Any,
) -> LangGraphChatSequrityAI
```
Create a ChatOpenAI-compatible client with Sequrity security features for LangGraph.
This is a convenience factory function that creates a LangGraphChatSequrityAI instance configured to route requests through Sequrity's secure orchestrator.
Parameters:

- `sequrity_api_key` (`str`) – Sequrity API key (required)
- `features` (`FeaturesHeader | None`, default: `None`) – Security features configuration (LLM mode, taggers, etc.)
- `security_policy` (`SecurityPolicyHeader | None`, default: `None`) – Security policy configuration (SQRT/Cedar policies)
- `fine_grained_config` (`FineGrainedConfigHeader | None`, default: `None`) – Fine-grained session configuration
- `service_provider` (`LlmServiceProvider | LlmServiceProviderStr`, default: `OPENROUTER`) – LLM service provider (`LlmServiceProvider` enum or string literal)
- `llm_api_key` (`str | None`, default: `None`) – Optional API key for the LLM provider
- `base_url` (`str | None`, default: `None`) – Sequrity base URL (default: https://api.sequrity.ai)
- `endpoint_type` (`EndpointType | str`, default: `CHAT`) – Endpoint type (`chat`, `code`, `lang-graph`). Defaults to `chat`.
- `model` (`str`, default: `'gpt-4'`) – Model name to use
- `**kwargs` (`Any`, default: `{}`) – Additional arguments passed to ChatOpenAI
Returns:

- `LangGraphChatSequrityAI` – Configured LangGraphChatSequrityAI client instance
#### Example

```python
from sequrity.control.integrations.langgraph import create_sequrity_langgraph_client
from sequrity.control import FeaturesHeader

# Basic usage with dual-LLM
llm = create_sequrity_langgraph_client(
    sequrity_api_key="your-key",
    features=FeaturesHeader.dual_llm(),
)

# With a security policy
from sequrity.control import SecurityPolicyHeader

llm = create_sequrity_langgraph_client(
    sequrity_api_key="your-key",
    features=FeaturesHeader.dual_llm(),
    security_policy=SecurityPolicyHeader.dual_llm(),
)

# Use with LangChain
response = llm.invoke([{"role": "user", "content": "Hello!"}])

# Use with LangGraph
from langgraph.graph import StateGraph
from langgraph.prebuilt import ToolNode

llm_with_tools = llm.bind_tools([...])
# ... build your graph
```