At DX-Solutions, we didn't jump into AI yesterday. Our first AI event was back in 2019. Back then, generative AI was still in its infancy, but we already saw its enormous potential for businesses. In 2023 and 2024, we organized AI hackathons again, exploring with customers and partners how AI can deliver real added value to business processes. Today, we're building on this foundation. We continue to brainstorm, test, and experiment with our team to understand the technical and strategic needs for building sustainable, reliable, and affordable AI solutions. One of the most interesting experiments we've conducted recently revolves around comparing small models (1.5B) with large models (70B).

 

What does 1.5B or 70B actually mean?

 

The number after an AI model's name, such as 1.5B or 70B, refers to the number of parameters in the model.

 

  • More parameters = more knowledge and reasoning power, but also much more memory and energy consumption.

  • A 1.5B model is small and light, ideal for simple applications or embedded solutions.

  • A 70B model is many times more powerful, can handle more complex reasoning, but also requires heavy hardware and a lot of energy.
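To make those parameter counts concrete, here is a rough sketch of the memory needed just to hold the weights. The arithmetic is illustrative: real deployments also need memory for activations, caching, and runtime overhead.

```python
# Rough memory footprint of model weights alone (illustrative figures only;
# excludes activations, KV cache, and runtime overhead).

def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GB for a given parameter count."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for name, params in [("1.5B", 1.5), ("70B", 70.0)]:
    fp16 = weight_memory_gb(params, 2.0)   # 16-bit floats: 2 bytes per parameter
    int8 = weight_memory_gb(params, 1.0)   # 8-bit quantized: 1 byte per parameter
    print(f"{name}: ~{fp16:.1f} GB at fp16, ~{int8:.1f} GB at int8")
```

The gap is stark: a 1.5B model fits comfortably on a consumer GPU, while a 70B model at fp16 needs well over a hundred gigabytes spread across multiple data-center cards.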

 

 

This might sound simple: bigger is better, right? But in practice, it's more nuanced.

 

The big differences in practice

 

When we performed the same task on a 1.5B model and a 70B model, the differences were clear. The larger model was more accurate, more complete, and could identify more complex correlations. However, this was offset by:

 

  • Exponentially higher hosting costs

  • Much more powerful hardware (think GPUs like the H100, which draws up to 700W per card)

  • Higher latency (longer response times)

 

So a 70B model is more powerful, but not always the most sustainable or cost-efficient choice.

 

Sustainable & reliable AI goes beyond just a large model

 

Behind every smart AI assistant is not just a model, but an ecosystem of data, hardware, energy, and optimization.

If we truly want to build sustainable and reliable AI, we need to look beyond just the output.

 

1. The model itself

 

Larger models (70B+) are more powerful, but require exponentially more memory and energy. Smaller models (1.5B–7B) are cheaper and lighter, but require better data and optimization to perform reliably.

A hybrid approach is gaining ground here: small models for 80% of the tasks, large models only for the more complex exceptions.
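A minimal sketch of what such a hybrid setup could look like. The routing heuristic and the model functions are hypothetical stand-ins, not a real inference API:

```python
# Hypothetical sketch of a hybrid router: send most requests to a small
# model and escalate only the complex ones to a large model.
# `small_model` and `large_model` stand in for real inference endpoints.

def is_complex(prompt: str) -> bool:
    """Naive heuristic: long prompts or explicit multi-step requests escalate."""
    return len(prompt.split()) > 200 or "step by step" in prompt.lower()

def route(prompt: str, small_model, large_model) -> str:
    model = large_model if is_complex(prompt) else small_model
    return model(prompt)

# Usage with stub models:
small = lambda p: f"[1.5B] answering: {p}"
large = lambda p: f"[70B] answering: {p}"
print(route("What is VAT?", small, large))  # short question: small model
```

In production, the routing signal is usually more sophisticated (a classifier, or confidence scores from the small model itself), but the cost logic is the same: the expensive model only runs when it is actually needed.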

 

2. Data is the fuel

 

It's not about more data, it's about better data.

  • Curation and cleaning are essential

  • Fine-tuning makes a model smarter in a specific context

  • Distillation can transfer knowledge from a large model to a smaller model

 

This means that a small model can still deliver performance that is surprisingly close to that of a large model, but with much lower energy consumption.
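As an illustration of the idea behind distillation, here is a minimal sketch of the classic soft-target loss: the student is nudged to match the teacher's softened output distribution rather than only hard labels. The logit values are invented for the example:

```python
import math

# Minimal sketch of knowledge distillation: the student is trained to match
# the teacher's softened output distribution, not just the final answer.

def softmax(logits, temperature=1.0):
    z = [l / temperature for l in logits]
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions."""
    p = softmax(teacher_logits, temperature)  # teacher: soft targets
    q = softmax(student_logits, temperature)  # student: predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# The closer the student mimics the teacher, the smaller the loss:
teacher = [3.0, 1.0, 0.2]
print(distillation_loss(teacher, [3.0, 1.0, 0.2]))  # 0.0: identical outputs
print(distillation_loss(teacher, [0.2, 1.0, 3.0]))  # larger: student disagrees
```

Minimizing this loss over a training set is what lets a small student inherit much of a large teacher's behavior at a fraction of the inference cost.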

 

3. Hardware and energy consumption

 

  • An L40S GPU consumes 300–400W per card

  • An H100 GPU consumes up to 700W per card

 

If you run a 70B model in production, you are quickly talking about thousands of euros per month in hosting costs and energy consumption.
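A back-of-the-envelope calculation makes this tangible. The electricity price and card count below are assumptions for illustration, not measured figures:

```python
# Back-of-the-envelope energy cost for GPUs running around the clock.
# The electricity price is an assumption (EUR 0.30/kWh); real tariffs vary,
# and total hosting costs (hardware, cooling, rack space) come on top.

def monthly_energy_cost_eur(watts_per_card: float, num_cards: int,
                            eur_per_kwh: float = 0.30,
                            hours: float = 24 * 30) -> float:
    kwh = watts_per_card * num_cards / 1000 * hours
    return kwh * eur_per_kwh

# A 70B model typically needs multiple H100-class cards (~700W each):
print(f"4x H100: ~EUR {monthly_energy_cost_eur(700, 4):.0f}/month in energy alone")
```

Even before hardware amortization and cooling, the energy bill alone lands in the hundreds of euros per month, which is why total hosting costs quickly reach thousands.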

Smarter choices like quantization (reducing the numerical precision of model weights without much loss of quality) and distillation can halve energy consumption, without any significant impact on the quality of the answers.
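To show what quantization means in practice, here is a minimal sketch of symmetric int8 weight quantization. This is a simplified per-tensor scheme; production tools typically quantize per channel or per block for better accuracy:

```python
import numpy as np

# Minimal sketch of symmetric int8 quantization: weights are mapped onto
# 8-bit integers plus one scale factor, so each weight takes 1 byte
# instead of 4 (fp32) or 2 (fp16).

def quantize_int8(weights: np.ndarray):
    scale = np.abs(weights).max() / 127.0           # one scale for the tensor
    q = np.round(weights / scale).astype(np.int8)   # 8-bit representation
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(1000).astype(np.float32)
q, s = quantize_int8(w)
err = np.abs(w - dequantize(q, s)).max()
print(f"memory: {w.nbytes} -> {q.nbytes} bytes, max rounding error: {err:.4f}")
```

The memory drops by 4x while the rounding error stays below one quantization step, which is why quantized models can answer nearly as well as their full-precision originals.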

 

 

4. Optimization = sustainability

 

You don't always have to run the largest model. Often, a smaller model performs well enough when combined with:

 

  • Smarter prompting

  • RAG (Retrieval-Augmented Generation) to add external knowledge sources

  • Fine-tuning on your own domain data

These optimizations allow small models to punch above their weight, without the expense and bulk of the largest models.
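A toy sketch of the RAG idea: look up the most relevant snippet from a knowledge base and prepend it to the prompt, so even a small model answers from your own data. The knowledge-base contents are invented, and real systems use embedding search rather than simple word overlap:

```python
# Toy RAG sketch: retrieve the best-matching snippet and build a grounded
# prompt. Real systems use vector embeddings; plain word overlap is used
# here only to keep the idea visible. The knowledge base is made up.

KNOWLEDGE_BASE = [
    "VAT returns in Belgium are filed monthly or quarterly.",
    "Invoices must be kept for at least seven years.",
]

def retrieve(question: str) -> str:
    """Return the snippet sharing the most words with the question."""
    words = set(question.lower().split())
    return max(KNOWLEDGE_BASE,
               key=lambda doc: len(words & set(doc.lower().split())))

def build_prompt(question: str) -> str:
    context = retrieve(question)
    return f"Context: {context}\n\nQuestion: {question}\nAnswer using the context."

print(build_prompt("How long must invoices be kept?"))
```

Because the relevant knowledge arrives via the prompt instead of being baked into the weights, the model itself can stay small, and the knowledge base can be updated without retraining anything.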

 

 

5. Reliability doesn't come naturally

 

AI should not only generate answers, but also be able to explain why it gives them.

  • Monitoring and feedback loops are crucial

  • Quality checks make the difference between a gimmick and a valuable tool

  • AI must perform repeatably and predictably to be reliable in business processes
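One simple way to approach that repeatability requirement, sketched as a hypothetical check that calls a model several times with the same input and flags it when the answers diverge (`model` is a stand-in for a real inference call):

```python
from collections import Counter

# Hypothetical repeatability check: ask the same question several times and
# require that the most frequent answer covers a minimum share of the runs.
# `model` is a stand-in for a real inference call.

def repeatability_check(model, prompt: str,
                        runs: int = 5, threshold: float = 0.8) -> bool:
    """True if the most common answer appears in >= threshold of the runs."""
    answers = Counter(model(prompt) for _ in range(runs))
    most_common_count = answers.most_common(1)[0][1]
    return most_common_count / runs >= threshold

# A deterministic stub passes; a model that keeps changing its answer would not:
stable = lambda p: "42"
print(repeatability_check(stable, "What is the answer?"))  # True
```

Checks like this belong in a monitoring loop: run them continuously on representative prompts, and alert when consistency drops, before users notice.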

 

 

The future of AI: smarter, more conscious and more sustainable

 

So the future of AI is not simply "bigger is better", but smarter, more conscious and more sustainable.

By consciously choosing the right balance between model size, data, hardware, and optimization, we build AI that is not only smart, but also energy-efficient, scalable, and future-proof.

 

How we approach this at DX-Solutions

 

At DX-Solutions, we help companies make informed choices. Together, we determine which AI strategy and model best suits their needs. This way, we build solutions that:

 

✅ Are smarter thanks to the right data and optimization

✅ Are affordable and scalable in production

✅ Are sustainable in energy consumption and hardware costs

✅ Remain reliable through monitoring and feedback

AI agents: always smart, always available

Meet Milo, Pacey, Robert and our other AI agents. Specially tailored to SMEs, for specific tasks in your company. Financial or legal assistance, inventory optimization or even improving the health of your employees: it's all possible.

Discover our AI agents