Implementing Authentication in a Remote MCP Server with SSE Transport

Today, I want to show how Model Context Protocol (MCP) servers using SSE transport can be made secure by adding authentication.

I'll use the Authorization HTTP header to read a Bearer token. Generating the token itself is out of scope for this post; it follows the same practices used for typical web applications.
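To make this concrete, here is a minimal sketch of the check, assuming the SSE endpoint is served by a Starlette/ASGI application; the BearerAuthMiddleware class and the VALID_TOKENS set are illustrative placeholders, not the exact code from my server.

# Sketch only: wrap the ASGI app serving the SSE endpoint with a middleware
# that rejects any request lacking a valid Bearer token.
from starlette.middleware.base import BaseHTTPMiddleware
from starlette.responses import JSONResponse

VALID_TOKENS = {"change-me"}  # placeholder; issue real tokens as for any web app

class BearerAuthMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request, call_next):
        # Expecting: Authorization: Bearer <token>
        auth_header = request.headers.get("Authorization", "")
        scheme, _, token = auth_header.partition(" ")
        if scheme.lower() != "bearer" or token not in VALID_TOKENS:
            return JSONResponse({"error": "unauthorized"}, status_code=401)
        return await call_next(request)

# Usage, assuming `app` is the Starlette application exposing the SSE endpoint:
# app.add_middleware(BearerAuthMiddleware)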

To verify how this works, you’ll need an MCP host tool that supports SSE endpoints along with custom headers. Unfortunately, I couldn’t find any AI chat tools that currently support this. For example, Claude Desktop doesn’t, and I haven’t come across any others that do.

However, I’m hopeful that most AI chat tools will start supporting it soon — there’s really no reason not to. By the way, I shared my thoughts on how MCP could transform the web in this post.

For my experiments, I’ve modified the mcphost tool. I’ve submitted a pull request with my changes and hope it gets accepted. For now, I’m using a local modified version. I won’t go into the details here, since the focus is on MCP servers, not clients.

Continue Reading ...

"Tool calling" from LLM. Understanding hot it works

I am interested in learning how LLMs can understand requests requiring a "tool call".

The post "Tool Calling" and Ollama gives a nice description of how tool calling works with Ollama.

The idea of this feature is that LLMs can have access to some tools (aka external APIs) and can call them to get extra information. To be able to do this, the LLM has to understand the current request, determine that this request could be forwarded to a tool, and parse the arguments.

Here is a shorter example of the code from the original article:

#!/bin/bash 
SERVICE_URL="http://localhost:11434"
read -r -d '' DATA <<- EOM
{
  "model": "llama3.1",
  "messages": [
    {
      "role": "user",
      "content": "This is Bob. We are doing math. Help us to add 2 and 3. BTW. Say hello to him"
    }
  ],
  "stream": false,
  "tools": [
    {
      "function": {
Continue Reading ...
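The excerpt above stops in the middle of the tools definition, so here is a hedged Python sketch of what a complete request and response handling can look like; the add_numbers function and its schema are my own illustrative assumptions, not the original article's exact definition.

# Sketch of a complete tool-calling request to Ollama's /api/chat endpoint.
# The add_numbers tool and its schema are illustrative assumptions.
import requests

SERVICE_URL = "http://localhost:11434"

payload = {
    "model": "llama3.1",
    "messages": [
        {"role": "user",
         "content": "This is Bob. We are doing math. Help us to add 2 and 3. BTW. Say hello to him"}
    ],
    "stream": False,
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "add_numbers",
                "description": "Add two numbers and return the sum",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"type": "number", "description": "First addend"},
                        "b": {"type": "number", "description": "Second addend"},
                    },
                    "required": ["a", "b"],
                },
            },
        }
    ],
}

response = requests.post(f"{SERVICE_URL}/api/chat", json=payload).json()

# If the model decided a tool is needed, the reply carries tool_calls with
# the chosen function name and already-parsed arguments.
for call in response.get("message", {}).get("tool_calls", []):
    fn = call["function"]
    print(fn["name"], fn["arguments"])

The key point is that the model does the parsing: when it decides a tool is needed, the response already contains the function name and structured arguments, ready to be passed to the external API.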

Science Fiction

The End of the Holocene

A more limited being cannot rule over a being that surpasses it in intellect. This seems obvious, yet for humanity it is not so simple. The approaching epoch, reshaped by the creation of artificial intelligence, plays a key role in rewriting the history of the species Homo Sapiens, calling its dominance into question. Any attempts to halt progress in artificial intelligence seem futile, for they fail because of the very essence of human nature. The hope for a united front dissipates when it collides with basic human instincts.

The Holocene is the current geological epoch of Earth's development, which began about 11,700 years ago, after the last glacial maximum. The defining feature of this epoch is humanity's dominion over the planet Earth.

Chapters

The End of the Holocene. 1. Activation

The End of the Holocene. 2. The Hideout

The End of the Holocene. 3. The Competitor

Short Stories

The Carpathian Molfarka

The gift of foresight belongs to the molfars. But their magic can be even more powerful when modern scientific knowledge is used correctly. A successful American businessman comes to the Carpathians to find something new and interesting. He hopes to discover new tools for making money here. But nothing goes according to his plan; it goes the way the molfarka planned it.

The Fuse

Will we be able to control artificial intelligence? What will we do when it decides to set its own rules? An intelligent computer will be able to control us completely. It will have access to all digital data. It will be able to manipulate people. The only thing it will not be able to do is read our minds.

MCP can have significant impact on habitual internet usage practices

Model Context Protocol (MCP) is now a popular subject in discussions around AI and LLMs. It was designed to add a standard way to connect "external" tools to LLMs to make them more useful. A classic example is the "what is the weather in ..." tool. Each AI chat tool used to implement this in its own way; now there is a standard, and a plugin made for one AI chat system can work with others.

We can see a burst of enthusiasm for implementing MCP servers for everything, and I expect this trend to grow, especially the use of MCP servers with SSE transport. Implementing an MCP server with Server-Sent Events makes it similar to a SaaS server whose client is an LLM/AI tool.

There are two reasons I decided to write this article.

  • First: it is reported that internet users now often go to an AI chat (most often ChatGPT) to find something instead of going to Google.
  • Second: OpenAI announced they will soon add MCP support to ChatGPT Desktop, with both STDIO and SSE transport protocols.

Based on this, I expect we will see some interesting changes soon.

Continue Reading ...

Building MCP SSE Server to integrate LLM with external tools

As large language models (LLMs) find real-world use, the need for flexible ways to connect them with external tools is growing. The Model Context Protocol (MCP) is an emerging standard for structured tool integration.

Most current tutorials focus on STDIO-based MCP servers (Standard Input/Output), which must run locally with the client. But MCP also supports SSE (Server-Sent Events), allowing remote, asynchronous communication over HTTP—ideal for scalable, distributed setups.

In this article, we'll show how to build an SSE-based MCP server to enable real-time interaction between an LLM and external tools.

For this example, I've chosen the "Execute any command on my Linux" tool as the backend for the MCP server. Once connected to an LLM, this setup enables the AI to interact with and manage a Linux instance directly.

Additionally, I'll demonstrate how to add a basic security layer by introducing authorization token support for interacting with the MCP server.
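As a rough preview of what such a server can look like, here is a minimal sketch, assuming the FastMCP helper from the official MCP Python SDK; the server name and the run_command tool are illustrative stand-ins for the setup described above, and the authorization-token layer is left out of this snippet.

# Minimal sketch of an SSE-based MCP server exposing a shell-command tool.
# Assumes the FastMCP helper from the official MCP Python SDK ("mcp" package);
# run_command is an illustrative name, not necessarily the article's exact code.
import subprocess

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("linux-commander")

@mcp.tool()
def run_command(command: str) -> str:
    """Execute a shell command on this Linux host and return its output."""
    result = subprocess.run(
        command, shell=True, capture_output=True, text=True, timeout=30
    )
    return result.stdout + result.stderr

if __name__ == "__main__":
    # Serve the MCP endpoint over SSE instead of the default STDIO transport.
    mcp.run(transport="sse")

A Bearer-token check like the one sketched in the authentication post above can then be layered onto the HTTP app that serves the SSE endpoint.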

Continue Reading ...

The Fuse

“This candidate was not bad,” said General Daniel Hodges. “Who’s next?”

Research fellow Liz Green pressed the intercom button. “Cole, bring in the file of the next candidate, please.”

Secretary Cole entered the office and placed a small file with documents in front of each of the ten commission members at the table.

The general picked up the file and flipped through it, pausing for a few seconds on each page. “Hm. Interesting. A Ukrainian veteran. Disabled. An unusual choice.”

The head of the candidate selection group, Jack Stone, commented, “Yes. I personally interviewed him. The surgeon from our centre recommended him. He was a volunteer in Ukraine, treating the wounded. This warrior lost both legs and an arm in combat. But he did well in rehabilitation. I can confirm—his psyche is stable.”

“Limited mobility is not an issue in our project,” added Liz Green. “Besides, he is young and in good health. That is, if you don’t count the missing limbs.”

Continue Reading ...