AI on Pure Contrast Tools: Servant, Not Master

13 February 2026

People are right to be worried about AI. It can threaten privacy when every prompt and click is logged and fed into systems that learn from our behaviour. It has a real environmental cost: data centres consume huge amounts of energy and water to train and run models. And it has been used to scrape books, art, and the work of creators without consent, turning their intellectual property into training data. Those concerns are valid. They deserve serious answers, not marketing slogans.

AI is here to stay, though. The question is not whether we have it, but whether we make it a servant or a master. Society has to choose.

The Upside: Using Information, Not Just Hoarding It

AI can be genuinely useful. It can surface in seconds what might have taken hours to find by hand: a fact, a comparison, a summary, a translation. The 1990s gave us the information revolution and put the world's knowledge at our fingertips. The problem was that we then had more information than we could ever use. AI now offers a way to use that information efficiently. That is a real gain, if we keep it in the right hands.

The risk is that the gains are captured by a few. If the data that powers AI stays in private hands and the benefits flow only to shareholders, we will not get a post-scarcity world. We will get something closer to feudalism: a small class that owns the machines, and everyone else dependent on them. There are ways for ordinary people to push back. We can prefer tools that do not train on our data. We can support transparency about environmental impact. We can demand that AI serves the many, not the few. The future does not have to be Mad Max. It could be closer to Star Trek: technology that expands what humans can do, instead of replacing them in the service of profit.

How Pure Contrast Tools Uses AI

On this site we use AI in a few places: scam detection in Pure Mail, the Scots translator (Pure Patter), the oracle, and similar features. We do it in a way that respects the concerns above.

Our AI Principles

  • Your IP is never sent to the AI. Our backend talks to the AI provider. Your browser never does. Your IP address and identity stay on our server; the model only sees the content you choose to submit (e.g. the text of an email or a phrase to translate).
  • You decide what you share. We only send what is needed for the feature. You are not logged in; we do not build a profile. You choose what goes into the request.
  • Your data is not used for training. We use Mistral AI's API. Their terms state clearly that they do not use your prompts or outputs to train their models. API data is not fed back into the system. That is a condition we require.
  • We chose an environmentally transparent provider. Mistral has published the first comprehensive lifecycle analysis of a large language model, in partnership with Carbone 4 and France's ecological transition agency (ADEME). They report training and per-query impacts (carbon, water, resource use), and they are pushing for industry-wide standards so that buyers and users can compare models. They joined the Coalition for Sustainable AI at the Paris AI Action Summit to advance that. Per query, the marginal impact of a typical response is in the same ballpark as a few seconds of video streaming. We are not saying AI has no footprint; we are saying that transparency and efficiency matter, and we chose a provider that takes that seriously.
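The first principle above is, in practice, a server-side proxy: the browser submits content to our backend, and the backend builds a fresh request to the provider containing only that content. The sketch below illustrates the idea in Python; the function name, request fields, and model string are illustrative assumptions, not this site's actual code.

```python
def build_provider_request(user_content: str, task: str) -> dict:
    """Build the JSON body sent to the AI provider.

    Only the content the user chose to submit is forwarded. The
    caller's IP address, cookies, and headers never appear here,
    because the browser talks to our server, not the provider.
    """
    return {
        "model": "mistral-small-latest",  # example model name
        "messages": [
            {"role": "system", "content": f"Task: {task}"},
            {"role": "user", "content": user_content},
        ],
    }


# What arrives at our server from the browser...
incoming = {
    "client_ip": "203.0.113.7",          # stays on our server
    "user_agent": "Mozilla/5.0 ...",     # stays on our server
    "content": "Is this email a scam?",  # the only thing forwarded
}

# ...versus what actually leaves for the provider.
payload = build_provider_request(incoming["content"], "scam-detection")
assert "client_ip" not in str(payload)
```

The point of the pattern is structural rather than policy-based: identifying data cannot leak to the provider because the outbound request is built from scratch and never contains it in the first place.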

Choosing AI That Serves Us

As a society we have to choose AI that benefits people, not only shareholders. That means being vigilant. It means not simply opting for the easiest or shiniest option. It means asking who sees our data, whether it is used for training, and what the environmental and social costs are. At Pure Contrast Tools we try to answer those questions in advance: privacy by design, no training on user data, and a provider that reports its footprint and advocates for global standards. We want AI to be a servant, not a master. That starts with the choices we make today.