Off-Cloud AI Appliance

Silent AI

100% local. 100% secure. 100% yours.

Challenges

of using GenAI (GPT) in the company

AI in the company

AI can search multiple data sources for desired information and summarize the results in a comprehensible answer.

Summaries of detailed content, translations and reformulations are also among the strengths of generative AI.

The use of Generative AI (GenAI) such as ChatGPT undoubtedly has a lot of potential for companies and institutions.

Unlike machine learning, which has been used for years for predictive maintenance and similar applications, GenAI is relatively new. Large language models (LLMs) allow complex questions to be answered in a human-like way instead of just providing a list of search results.

Questions replace searches, answers replace search results.

Companies and institutions hope that this will, above all, increase efficiency when the same questions have to be answered over and over again, e.g. in first-level support.

However, when planning the use of GenAI, a number of problems arise with public GenAI solutions. The most pressing is privacy and data security: with cloud solutions, the data entered by users can be used to further train the LLM and enrich its “knowledge”. Hallucination and non-transparent pricing are further obstacles to the use of publicly available GPT solutions.

It's Your Data

Silent AI.

100% local. 100% secure. 100% yours.

Silent AI is an AI appliance that combines all components for self-sufficient AI.

The platform consists of a state-of-the-art architecture that has been optimized for maximum data throughput between storage and processors.

The storage is based on our highly secure and proven storage systems and offers maximum security against data loss and cyber attacks.

The software is based on hardened Linux. Several vector databases with integrated rights management contain the information that is created in advance from your source data using our parsers and connectors. The language model (LLM) used is pre-trained for your requirements and runs completely self-sufficiently. The API allows Silent AI functionality to be integrated into other applications.

The FAST LTA CARE service takes care of the smooth and secure operation of the system, and we are available 24/7 on request. Consistent costs in the long term ensure planning security.

Local.

Silent AI runs entirely in a local environment and works without any online connection. This means you always retain complete control over your data, which you can also remove at any time, even partially.

Private.

Silent AI is integrated into your rights and user management system. Access to the data used is based on the rights of the users. Your data never leaves the local environment.

Secure.

Silent AI is based on our decades of experience with highly secure storage systems, where data must never be lost or compromised through misuse.

Managed.

Silent AI will be available as a turnkey appliance for AI-supported enterprise knowledge management, e.g. to support sales or customer service. Data from various sources can be integrated.

Sustainable.

As no training of the LLM takes place in Silent AI, the system requires far fewer GPU resources than comparable systems. This saves energy and ensures a low CO2 footprint.

From Europe.

Silent AI is developed in Germany and is compliant with GDPR and European AI legislation. You have access to local support and are looked after personally.

No re-training.

Local language models with RAG enable completely self-sufficient AI.

Local LLM

It is no longer just commercial “black box” language models that are available. The open source community has produced a large number of freely available LLMs whose performance can hold its own against ChatGPT and others.

These models can be used locally and therefore completely without any Internet access.

Instead of having to use very large, universal language models that require enormous resources for training and deployment, these LLMs can be pre-trained for specific tasks and thus kept lean.
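As a minimal sketch of this principle (not the actual Silent AI stack), a freely available model that has been downloaded once can be loaded and queried entirely offline, for example with the Hugging Face transformers library; the model path below is an assumption:

```python
# Minimal sketch: querying a locally stored open-source LLM fully offline.
# The model path is an example; any locally downloaded instruction-tuned
# model can be used in the same way.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "/models/mistral-7b-instruct"  # local copy, no internet access needed

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, local_files_only=True)

prompt = "Summarize the warranty conditions for product X."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```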

RAG injection

To make this local language model take private data sources into account, the relevant information is passed to the LLM together with the prompt at query time using RAG (Retrieval-Augmented Generation). The LLM obtains this information from a vector database that was generated in advance from the source data.

If the LLM's access to this database is removed, answers that can only be derived from this private information can no longer be retrieved.

The use of a lean LLM and RAG injection makes it possible to separate the language (understanding and output) from the knowledge (information sources).
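The following sketch illustrates this separation in simplified form; it is not the Silent AI implementation. The embedding model, the FAISS index and the ask_llm helper are assumptions chosen for illustration:

```python
# Simplified RAG sketch: embed the question, retrieve the most similar chunks
# from a local vector index, and inject only those chunks into the prompt.
import faiss
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # runs locally

chunks = [
    "The warranty for series X is 24 months from the date of delivery.",
    "Support tickets are answered within 8 business hours.",
]
index = faiss.IndexFlatIP(embedder.get_sentence_embedding_dimension())
index.add(embedder.encode(chunks, normalize_embeddings=True))

def answer(question: str, k: int = 2) -> str:
    query_vec = embedder.encode([question], normalize_embeddings=True)
    _, ids = index.search(query_vec, k)              # nearest chunks only
    context = "\n".join(chunks[i] for i in ids[0])
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return ask_llm(prompt)  # hypothetical call to the local language model
```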

Transparent use

Providers of public GenAI solutions will have to integrate various advertising models into their services to cover the immense costs, as Perplexity has just announced. The basis for this is, of course, the evaluation of the queries entered and the resulting answers.

Even after anonymization, there is a risk that company information will be stored and used by GenAI providers based on the questions.

Silent AI naturally refrains from any kind of evaluation of the queries and answers. Users can optionally rate the quality of the answer after a query and make these ratings available to FAST LTA.

Privacy and data protection

ChatGPT in the company?

A recent study of companies in the USA shows that security and privacy are the biggest challenges when using generative AI in companies.

Source: sas.com, March 2024

We asked Perplexity:

Why is it not a good idea to upload sensitive company data to ChatGPT?

In summary, the use of ChatGPT for sensitive company data poses significant risks in terms of both data security and compliance with data protection regulations. It is therefore advisable to be extremely careful when entering such data and to consider alternative, more secure solutions.

VIEW FULL ANSWER AT PERPLEXITY

Applications

Silent AI is suitable for a range of text-based applications where private data must not leave the local site.

Intelligent Knowledge Management

Knowledge management is one of the biggest challenges facing companies and public authorities. The ability to quickly obtain correct answers to recurring questions increases efficiency and reduces training time.

Despite advancing digitalization, information is often distributed across multiple sources and systems. In most cases, it is retrieved via several search queries in the respective platform. As a result, employees receive more or less suitable search hits, from which the relevant information must then be laboriously extracted.

Silent AI turns this information retrieval across multiple search queries into a single comprehensible answer with precise source references.

This allows employees to quickly find relevant answers from different source systems.

As Silent AI takes existing rights management into account, employees only receive answers to information to which they have access. In addition, entire areas of information can be made temporarily or permanently inaccessible by decoupling databases.
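As a simplified illustration of this principle (the actual data model in Silent AI may differ), each stored chunk could carry the groups that are allowed to see it, and only permitted chunks would reach the prompt:

```python
# Sketch of rights-aware retrieval: every chunk carries the groups allowed to
# see it; only chunks the current user may access are passed on to the LLM.
# Field names are illustrative, not Silent AI's actual data model.
from dataclasses import dataclass

@dataclass
class Chunk:
    text: str
    source: str               # e.g. "sharepoint", "confluence"
    allowed_groups: set[str]  # e.g. AD groups

def visible_chunks(chunks: list[Chunk], user_groups: set[str]) -> list[Chunk]:
    """Keep only chunks the user is entitled to see."""
    return [c for c in chunks if c.allowed_groups & user_groups]

def without_source(chunks: list[Chunk], blocked_source: str) -> list[Chunk]:
    """Decoupling a whole database simply excludes all of its chunks."""
    return [c for c in chunks if c.source != blocked_source]
```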

Intelligent Coding Assistant

AI has long been helping with software development. However, current assistance systems are based on public cloud services, and it is not always transparent which information and code fragments are stored and used by the provider to improve the AI. In the worst case, code that required a high development effort, and has a correspondingly high value, can suddenly become freely available to everyone without the applicable license conditions being observed.

Using Silent AI keeps this assistance confined to the local development environment.

For this purpose, special LLMs can be used that have been optimized for the respective programming languages used. Code already available in the company can also be used for assistance via RAG.

AI integration

Silent AI can also be integrated into existing software. Replace the search field of your solution with a question field. Your users receive answers with information directly from your software, with high relevance and precise source information.

We offer a comprehensive API to integrate Silent AI into your application.

Whether DMS, financial application, documentation, ticket system or your specific industry solution: local AI integration with Silent AI improves the user experience and increases productivity.
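A hypothetical integration sketch could look like this: the question field of the host application forwards the query to the appliance and renders the returned answer and sources. Endpoint, field names and authentication below are assumptions, not the documented API:

```python
# Hypothetical integration sketch: forward a user's question to the Silent AI
# appliance over a REST API and return the answer with its sources.
# Endpoint, field names and authentication are assumptions.
import requests

def ask_silent_ai(question: str, user_token: str) -> dict:
    response = requests.post(
        "https://silent-ai.internal.example/api/v1/query",  # local appliance
        json={"question": question},
        headers={"Authorization": f"Bearer {user_token}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"answer": "...", "sources": [...]}
```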

Silent AI is an AI appliance that is installed in your or an edge data center. You have complete control over which connections Silent AI is allowed to establish, which data is processed and who has access to the appliance. Silent AI does not require a connection to any cloud service (except for optional monitoring of the system itself).

Silent AI is based on a special storage platform that benefits from our many years of experience in the field of secure storage systems. In addition to high internal data throughput, the focus during development was on data security and data privacy.

Silent AI is connected directly to your data sources and does not require an “upload” of documents. Our parsers can handle various text-based sources, such as Office 365 & SharePoint, Confluence / Jira, websites / intranet. You determine how often the information is updated from the respective sources.
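For illustration only (keys, values and format are assumptions, not Silent AI's actual configuration), such a source definition with refresh intervals might look like this:

```python
# Illustrative source configuration: which systems are indexed and how often
# the vector databases are refreshed. Names and intervals are assumptions.
SOURCES = {
    "sharepoint": {"url": "https://sharepoint.example.local", "refresh": "hourly"},
    "confluence": {"url": "https://confluence.example.local", "refresh": "daily"},
    "intranet":   {"url": "https://intranet.example.local",   "refresh": "weekly"},
}
```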

Silent AI is designed for up to 50 users per unit and application. As a local appliance, Silent AI respects your existing rights management (e.g. Active Directory) and can enable or restrict functions and access to databases accordingly.

Silent AI does not use token-based billing that “penalizes” you the more you use the system. In addition to the hardware, fixed license and maintenance fees ensure low costs that can be planned for in the long term.

Most AI appliances aim to build their own AI model (usually machine learning, more rarely GenAI / LLM) and train it for the specific use case. This process is lengthy, requires extremely high GPU computing power and is therefore very energy- and cost-intensive.

Silent AI combines a general, lean open source language model with targeted RAG injection from local vector databases. No training of an AI model is necessary for this. Creating and retrieving the vector databases requires a fraction of the GPU computing power of other AI appliances.
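The following sketch illustrates why this is cheap: “creating” a vector database is a single embedding pass over the text chunks, with no gradient updates or model training involved. The libraries used are assumptions chosen for illustration:

```python
# Sketch: building a vector database is one embedding pass over the chunks,
# not model training. Library choices are illustrative.
import faiss
from sentence_transformers import SentenceTransformer

def build_index(chunks: list[str]) -> faiss.Index:
    embedder = SentenceTransformer("all-MiniLM-L6-v2")
    vectors = embedder.encode(chunks, normalize_embeddings=True)
    index = faiss.IndexFlatIP(vectors.shape[1])
    index.add(vectors)  # single pass over the data, no gradient updates
    return index
```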

Machine learning has been used for decades to obtain high-quality forecasts for specific areas from as much historical data as possible. Predictive maintenance, financial forecasts and weather reports are just some of the common applications.

Generative AI combines information from different sources; it can be queried in natural language and provides answers, but it can also generate images, audio and video, create summaries, recognize patterns (image analysis) and write analyses. Predictions are not part of GenAI's repertoire.

Silent AI itself, however, is a purely text-based GenAI. It can be asked questions in natural language and writes answers based on the additional information provided via RAG at query time. Silent AI also has the capabilities of the respective language model, i.e. it can summarize, reformulate and translate texts.

Have we sparked your interest?

We would be happy to present Silent AI to you in a personal meeting and evaluate possible application scenarios with you. We are certain that your company can also benefit from the use of local generative AI.

GET IN TOUCH

AI from FAST LTA?

FAST LTA is known as a specialist for secure and reliable storage systems. Why is FAST LTA now developing an AI appliance and entering the application market?

Secure & private.

We see a problem with current AI offerings. Cloud-based services such as ChatGPT are based on the idea that the language model is constantly improving. To achieve this, the data and information entered while using the services is used to improve the LLM.

This cannot be in the interests of our customers, whose main concern is to protect their sensitive data from unauthorized access and misuse.

That is why, with Silent AI, we are creating an AI solution that is 100% local and 100% secure. This continues our commitment to taking responsibility for our customers' data. Our efforts to prevent data loss go far beyond what is usual among IT providers, and we want to offer the same in what is currently the fastest-growing sector.

Reliable & durable.

Based on the principle of data security, we develop solutions that are characterized in particular by their reliability and longevity. We continue to do this in our platform development for Silent AI.

As the use of AI in companies is all about productivity and efficiency, the reliability of the solution is of crucial importance. That is why we are incorporating our experience from the last few decades into Silent AI.

Sustainable & personal.

We make sure that we always choose the most sustainable solution for a task. We can only do this if we can actually control which components are used and how they are used.

According to expert estimates, ChatGPT has a CO2 footprint of 1.5-2 grams per query (in comparison: a search query on Google “costs” approx. 0.2 grams). Large, public language models are not sustainable due to their massive appetite for resources.

Even if the exact measurement results are still pending, we can already predict with certainty: local, pre-trained LLMs consume much less energy.

In addition, as a German (and European) company, we are committed to local sourcing and short distances. You benefit from local contacts, short delivery times and direct, personal support.