Data protection plays a crucial role in the era of AI – especially when sensitive company data is involved. Many companies are reluctant to use cloud-based AI services such as ChatGPT because confidential information leaves the company.
According to a Cisco study, three out of four German companies restrict the input of data into generative AI tools, and a good third have even completely banned their use internally. Nevertheless, strict bans cannot completely stop their use: Around half of employees worldwide have already entered sensitive company data into AI systems. A survey by Microsoft even revealed that 71% of employees use generative AI tools on their own without official authorisation from their employer. This unofficial ‘shadow AI’ harbours risks, as company data can be transferred to external cloud services in an uncontrolled manner.
Fortunately, there are now alternatives to cloud AI. Increasingly powerful open source models make it possible to operate AI functions locally within the company – and thus keep the data under your own control. For example, large language models can now be run in a scaled-down form on in-house computers. One prominent example is the Chinese model DeepSeek R1, which was recently published as an open reasoning model and has attracted a great deal of media interest. Such local AI models can be used, for example, to summarise texts, detect errors in data or analyse reports – without the information leaving the company. In the next step, we will take a look at how the LMStudio tool can be used under macOS to do just that.
What is LMStudio?
LMStudio is a free application that makes it easy to run large AI language models on your own computer – even on Macs with Apple Silicon. In contrast to purely developer-oriented solutions, LMStudio requires no programming knowledge and no access to the terminal. The software offers a user-friendly interface that combines the discovery, download and use of AI models in one tool, taking into account the existing hardware. All common open source models are available, including Llama, DeepSeek, Qwen (to which QwQ belongs) and Mistral. For companies, this means that they can use modern AI models themselves without having to rely on cloud services. The data remains completely local – LMStudio itself does not collect any user data, and all input and analyses remain private on the company’s own system. This approach avoids many data protection issues, as no sensitive information is transferred to third parties.
Powerful local models: The DeepSeek R1 and QwQ models are particularly interesting for corporate use. DeepSeek R1 is a highly developed ‘reasoning’ model that can solve complex tasks step by step. The full version of DeepSeek R1 is huge with 671 billion parameters, but thanks to distillation, significantly smaller variants (around 7 billion parameters) are also available – large enough for intelligent answers, small enough for local operation. QwQ, on the other hand, is a new 32-billion-parameter model from the Alibaba Qwen project, which is amazingly powerful thanks to sophisticated training methods. In tests, QwQ is even said to be able to keep up with much larger AI systems such as DeepSeek R1 (671B), especially when it comes to logical conclusions and complex analyses. These two models demonstrate the potential of local AI: they offer ChatGPT-like capabilities but run on the company’s own hardware.
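The relationship between parameter count, weight precision and memory requirements is easy to estimate. The short sketch below is only a back-of-the-envelope calculation – real memory use is higher because of the context window and runtime overhead – but it illustrates why a 7B distill fits on an ordinary Mac while the full 671B model does not.

```python
# Rough back-of-the-envelope estimate of model memory requirements.
# Assumption: memory ≈ parameters × bytes per weight; actual usage is higher
# because of the KV cache, context window and runtime overhead.

BYTES_PER_WEIGHT = {"fp16": 2.0, "8-bit": 1.0, "4-bit": 0.5}

def estimated_gb(parameters_billion: float, precision: str) -> float:
    """Approximate RAM needed just to hold the model weights, in GB."""
    bytes_total = parameters_billion * 1e9 * BYTES_PER_WEIGHT[precision]
    return bytes_total / 1024**3

for params, label in [(7, "DeepSeek R1 distill 7B"), (32, "QwQ 32B"), (671, "DeepSeek R1 full")]:
    for precision in ("fp16", "4-bit"):
        print(f"{label:>22} @ {precision}: ~{estimated_gb(params, precision):.0f} GB")
```

In practice you should add headroom for macOS itself and the model’s context, which is why the quantised builds offered in LMStudio’s catalogue are usually the sensible choice for Macs with 16–32 GB of RAM.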
Step-by-step guide
Here is an easy-to-follow guide to setting up LMStudio on macOS and using local AI models – DeepSeek and QwQ in particular – to analyse your data.
- Installing LMStudio on macOS
First, download LMStudio from the official website (there is a suitable installer for macOS). Open the downloaded .dmg file and drag the LMStudio app into your Applications folder, as is usual with macOS apps. Then start the application. (Note: LMStudio runs on Apple Silicon Macs; older Intel Macs are currently not supported.)
- Download the DeepSeek model
In LMStudio you can now choose from many models. Click on the model search icon on the left-hand side of the app and search the catalogue for “DeepSeek” – you should see entries for DeepSeek R1. Select a variant that matches your hardware: for a Mac with 16 GB of RAM, for example, the 7B or 14B distill version of DeepSeek R1 is recommended, as the full version with 671B parameters would require around 192 GB of RAM. Click on Download and wait until the model has been downloaded and installed (depending on the version, the file size can range from a few GB to dozens of GB).
- DeepSeek in use
Switch back to the LMStudio chat view and select the downloaded DeepSeek model as the active AI model (if this does not happen automatically). Now you can enter company data into the model – for example, the text of an internal report or log – and request a summary. For example, type: “Summarise the following report in a few sentences:” followed by the text of the report. The DeepSeek model analyses the text and provides a compact summary of the most important points without the text ever leaving your device. This allows you to quickly reduce longer documents or emails to their core statements. A special feature: up to five files such as PDF, DOCX, TXT or CSV can also be attached. (If you later want to automate such requests, a minimal sketch follows after this list.)
- Download the QwQ model
Next, you can try out the QwQ model for more complex analyses. Open the model catalogue in LMStudio again and search for “QwQ” (sometimes it is listed as part of the Qwen model family, e.g. Qwen QwQ 32B). Select the QwQ 32B model (or an available quantised version that runs on your Mac) and download it. Note that QwQ is somewhat larger with 32 billion parameters, so make sure you have enough RAM (at least 32 GB is recommended for the 32B model in full precision – if you have less RAM, choose a quantised version, e.g. 4-bit, which requires significantly less memory).
- QwQ in use (example: error analysis and data evaluation)
After loading QwQ, switch back to the chat interface and activate the QwQ model. This model is particularly good at logical reasoning and can help you find patterns or errors in your data. For example, enter a series of error messages or log entries from your IT department and ask: “What could be the causes of these errors?” QwQ will work through the messages and deduce possible correlations or causes. You can also give QwQ a list of customer feedback or survey data and ask it to evaluate the data – for example, to recognise key issues or derive suggestions. This gives you valuable insights from unstructured company data that would otherwise be difficult to obtain manually.
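For one-off questions the chat view is all you need, but LMStudio can also expose the loaded model through a local, OpenAI-compatible server, which makes it possible to script recurring tasks such as report summaries. The sketch below is a minimal example under the following assumptions: the local server is enabled in LMStudio, it listens on the default address http://localhost:1234, and the model identifier and file name are placeholders that you would replace with your own.

```python
# Minimal sketch: automate a summary request against LMStudio's local,
# OpenAI-compatible server. Assumptions: the local server is enabled,
# listens on http://localhost:1234, and a DeepSeek R1 distill is loaded;
# the model identifier and file path below are placeholders.
import requests

LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"
MODEL_ID = "deepseek-r1-distill-qwen-7b"  # placeholder – check the identifier LMStudio shows

report_text = open("internal_report.txt", encoding="utf-8").read()  # example file

response = requests.post(
    LMSTUDIO_URL,
    json={
        "model": MODEL_ID,
        "messages": [
            {"role": "system", "content": "You summarise internal reports concisely."},
            {"role": "user", "content": f"Summarise the following report in a few sentences:\n\n{report_text}"},
        ],
        "temperature": 0.2,
    },
    timeout=300,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Because the request only ever goes to localhost, the report text never leaves the Mac; the same call also works for the QwQ error-analysis prompt from the last step – only the model identifier and the user message change.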
Application examples at a glance: The range of things you can do with locally running AI models is diverse:
- Create summaries: Long documents, reports or meeting minutes can be summarised by the model into a few concise sentences. This saves time and makes it easier for decision-makers to keep track of large amounts of information.
- Analyse errors: Whether code snippets or system logs – the AI model can suggest possible causes of errors or solutions based on its knowledge. QwQ in particular, with its strong ‘reasoning’ ability, can think through complex cause-and-effect chains and provide hints for problem solving.
- Evaluate and categorise data: Models can help to analyse unstructured data. For example, customer comments can be automatically categorised (positive/negative, subject areas) or risky content can be flagged. AI can also recognise trends in sales reports or survey data and describe them verbally – one possible categorisation workflow is sketched below.
All these analyses are performed internally: the data remains within the company and the AI model works offline on your Mac. For you, this means fast analyses without violating data protection guidelines.
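To make the categorisation example concrete, here is a minimal sketch of how customer comments could be classified with the locally loaded model. It reuses the assumed local-server endpoint from the earlier sketch; the categories, the prompt wording and the model identifier are illustrative rather than a fixed scheme.

```python
# Sketch: classify customer feedback locally via LMStudio's OpenAI-compatible
# server (assumed at http://localhost:1234). Categories, prompt and model
# identifier are illustrative; adapt them to your own data.
import requests

LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"
MODEL_ID = "qwq-32b"  # placeholder – use the identifier LMStudio shows for your model

feedback = [
    "Delivery took three weeks, far too long.",
    "Support answered within an hour, great service!",
    "The new dashboard is confusing to navigate.",
]

prompt = (
    "Classify each customer comment as positive or negative and assign a topic "
    "(delivery, support, product). Answer as a JSON list of objects with the keys "
    "'comment', 'sentiment' and 'topic'.\n\n" + "\n".join(f"- {c}" for c in feedback)
)

resp = requests.post(
    LMSTUDIO_URL,
    json={"model": MODEL_ID, "messages": [{"role": "user", "content": prompt}], "temperature": 0},
    timeout=300,
)
resp.raise_for_status()
answer = resp.json()["choices"][0]["message"]["content"]
print(answer)  # raw model output; if the model kept the JSON format, it can be parsed with json.loads(answer)
```

Reasoning models such as QwQ may prepend their chain of thought to the answer, so the structured part of the output should still be checked before it is parsed or fed into downstream reports.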
Data protection and risks
The use of local AI models offers great advantages in terms of data protection – but you should not blindly trust these models either. Although your data remains on your own computer and LMStudio itself does not transmit anything to the outside world, the output of the AI must still be critically scrutinised. Even open models can give incorrect or ‘hallucinated’ answers. Smaller AI models in particular tend to make things up or provide inaccurate information. Built-in biases or gaps in training knowledge can also play a role. Although a powerful model such as QwQ impresses with its astute analyses, experts warn of potential risks here too – such as biased answers or undesirable content if the system is used incorrectly. The quality of the results should therefore always be validated by a specialist before decisions are made based on the AI conclusions.
In addition, just because the AI is running locally does not mean that every input is harmless. Companies should develop clear guidelines on which internal data may be used for AI analyses and how the results are to be used. Sensitive personal data, for example, is still subject to data protection laws (GDPR) and should only be processed with caution by an AI system – whether locally or in the cloud. The advantage of local models is that you retain control and can stop data processing immediately if necessary because everything runs on your own infrastructure.
Gain experience early on
It is worthwhile for companies to familiarise themselves with this technology now. Development in the field of AI is progressing rapidly – those who experiment early on can build up internal expertise and find out which use cases bring real added value. Introduce local AI on a small scale first, for example in a specialised department as a pilot project. This allows you to safely test how well the models handle your data and recognise any risks before a broad roll-out takes place. Training employees in the use of AI tools is also useful so that staff understand the strengths and limitations of the models. By taking this proactive approach, companies develop a feel for the opportunities of AI without relinquishing control of their data.
Utilising the opportunities of local AI
Local AI models such as DeepSeek and QwQ, powered by tools such as LMStudio, provide a way for organisations to take advantage of modern AI without compromising on data privacy. Internally operated AI can summarise business reports, uncover error patterns or prepare unstructured data in an understandable way – all while keeping confidential information secure within the company’s own IT. Of course, a critical approach is required, but with the right guidelines and a little practice, the risk can be managed. For decision-makers, this is an opportunity to utilise AI innovations in a targeted and compliant manner: You remain independent of cloud providers, can customise AI solutions to your own needs and at the same time strengthen the trust of customers and employees in the responsible handling of data. Local AI is therefore a promising building block for the future of corporate data analysis – data protection-friendly, flexible and full of possibilities.