Once that token was active, I could inject it directly into the system prompt for my chatbot’s AI Agent:

```php
use Drupal\Component\Serialization\Json;
use Drupal\Core\Entity\ContentEntityInterface;

/**
 * Builds a short context description for the entity the user is viewing.
 */
function my_get_current_ai_context(): string {
  $entity = NULL;
  $route_match = \Drupal::routeMatch();

  // Detect whether we’re in a chatbot API request.
  if ($route_match->getRouteName() === 'ai_chatbot.api') {
    $request = \Drupal::request();
    $content = $request->getContent();
    $data = Json::decode($content);

    // Extract the user’s original route from the chatbot payload.
    if (isset($data['contexts']['current_route'])) {
      $current_url = $data['contexts']['current_route'];
      /** @var \Drupal\Core\Routing\Router $router */
      $router = \Drupal::service('router.no_access_checks');
      try {
        $parent_route_match_info = $router->match($current_url);
        foreach (['node', 'taxonomy_term', 'commerce_product'] as $param_name) {
          if (isset($parent_route_match_info[$param_name])) {
            $entity = $parent_route_match_info[$param_name];
            break;
          }
        }
      }
      catch (\Exception $e) {
        // The route could not be matched; fall through without context.
      }
    }
  }
  else {
    // Fallback: resolve directly from the current route (non-AI contexts).
    foreach (['node', 'taxonomy_term', 'commerce_product'] as $param_name) {
      if ($route_match->getParameter($param_name)) {
        $entity = $route_match->getParameter($param_name);
        break;
      }
    }
  }

  if ($entity instanceof ContentEntityInterface) {
    $context = [
      '* URL: ' . $entity->toUrl('canonical', ['absolute' => TRUE])->toString(),
      '* Type: ' . $entity->getEntityTypeId(),
      '* Title: ' . $entity->label(),
    ];
    return implode("\n", $context);
  }
  return 'No context available.';
}
```
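For reference, the chatbot payload this function parses looks roughly like this. Only the `contexts.current_route` key is actually read by the code above; the message text and the path are hypothetical placeholders, not values from the original article:

```json
{
  "message": "Does this stove support external air supply?",
  "contexts": {
    "current_route": "/products/austroflamm-ivy"
  }
}
```

Because the request arrives at the chatbot’s own API route, the original page can only be recovered from this payload, which is why the function matches `current_route` against the router rather than inspecting the active route.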
My first instinct was to solve this purely at the prompt level. I tried telling the AI something like:
That’s not a big deal when you’re dealing with generic Q&A. But it quickly becomes a problem when your chatbot lives inside a context-rich environment like a Drupal shop or product catalog.
Now, when a user on the Austroflamm Ivy fireplace stove page asks:
Final Thoughts
```php
// Fragment of hook_tokens(): resolves the [current-page:ai-context] token.
if ($type === 'current-page') {
  foreach ($tokens as $name => $original) {
    switch ($name) {
      case 'ai-context':
        $replacements[$original] = Markup::create(my_get_current_ai_context());
        // The replacement varies per page, so bubble up the path cache context.
        $bubbleable_metadata->addCacheContexts(['url.path']);
        break;
    }
  }
}
```
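For completeness, a custom token also has to be declared before Drupal will offer it for replacement. A minimal sketch of the companion `hook_token_info()` implementation, assuming a custom module named `my_module` (the module name and the human-readable descriptions are placeholders, not from the original article):

```php
/**
 * Implements hook_token_info().
 */
function my_module_token_info(): array {
  return [
    'types' => [
      'current-page' => [
        'name' => t('Current page'),
        'description' => t('Tokens related to the page the user is viewing.'),
      ],
    ],
    'tokens' => [
      'current-page' => [
        'ai-context' => [
          'name' => t('AI context'),
          'description' => t('Entity context injected into the AI system prompt.'),
        ],
      ],
    ],
  ];
}
```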
The Critical Part: Detecting the Chatbot Context
```
## Current Page Context

**The current page context is critical.**
If the user asks a question while viewing a detail page, the answer should primarily consider this information.

The current page context is provided here:

[current-page:ai-context]

If this context is available, prioritize questions related to it over a general search in the index.
If no context is available, proceed normally without referencing it.
```
By structuring the prompt this way, the AI explicitly knows to attend to the page-specific entity first, rather than giving generic answers or falling back on earlier conversation history.