AIVEA

CognitEye - Company and Team

CognitEye is a deep-tech company with expertise in AI and Assistive Technologies.

We are a multidisciplinary team of dedicated engineers and scientists.

Dr. Ehsan D. Farahani

  • Team manager, system design engineer
  • Data Scientist
    Amirhosein Baratai

    • Software engineer
    • Full Stack and Android developer
    Hosna Rohani

    • AI team leader, AI developer
    • Computer vision specialist

    EXPERIMENT OVERVIEW

    OVERVIEW

    Developing an AI-based voice assistant system for electronic assembly and installation tasks, specifically targeting Social Workplaces.

    1. Conversational AI for Task Execution
    2. Step-by-step guidance
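The step-by-step guidance above can be sketched as a small state machine driven by voice commands. This is an illustrative assumption, not the actual AIVEA implementation: the step texts and command names ("next", "back", "repeat") are invented, and in the real system the commands would come from OVOS speech recognition.

```python
# Hypothetical sketch: step-by-step assembly guidance as a state machine.
# Step texts and command names are invented for illustration.

ASSEMBLY_STEPS = [
    "Place the circuit board on the mat.",
    "Insert the resistor into slot R1.",
    "Solder the two resistor leads.",
]

class GuidanceSession:
    """Tracks the worker's position in a task and answers simple commands."""

    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def handle(self, command: str) -> str:
        # In a deployed system, `command` would come from speech recognition.
        if command == "repeat":
            return self.steps[self.index]
        if command == "next":
            if self.index + 1 < len(self.steps):
                self.index += 1
                return self.steps[self.index]
            return "All steps are complete."
        if command == "back":
            self.index = max(0, self.index - 1)
            return self.steps[self.index]
        return "Sorry, say 'next', 'back', or 'repeat'."

session = GuidanceSession(ASSEMBLY_STEPS)
print(session.handle("repeat"))  # first step
print(session.handle("next"))    # second step
```

A session object per worker keeps guidance stateless on the voice side: each utterance maps to one `handle` call.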

    CHALLENGES

    Challenges in social workplaces for assembly processes:

    • Cognitive and language barriers in assembly processes
    • High error rates and inefficiencies
    • Slow, resource-intensive onboarding of new employees

    OBJECTIVES

    Objective 1: Implement conversational guidance using NLP
    Objective 2: Provide Step-by-Step Task Guidance
    Objective 3: Develop an Open-Source Solution Using OVOS

    EXPERIMENT IMPACT

    EXPECTED RESULTS (KPIs)

    • Ensure 95% accuracy in voice command recognition and response during testing.
    • Task completion time reduction for new employees – target: 20% faster assembly times compared to traditional methods.
    • Worker satisfaction and inclusion – target: 20% improvement in worker satisfaction.

    VAFER

    EXPERIMENT OVERVIEW

    The VAFER experiment will deploy an innovative onboarding and training support system for high-tech manufacturing, leveraging Artificial Intelligence (AI) and the Open Voice OS (OVOS) framework. The experiment involves R2M Solution, the Digital Innovation Hub Lombardia (DIHL), and the T&G electronic repair centre for real-world validation.

    The solution operates in two main phases to improve human-machine collaboration. First, it conducts a precise skill assessment by combining automatic CV analysis via R2M’s AVANTI software with a natural, voice-driven AI interview powered by OVOS to identify exact trainee competencies and eliminate redundant training. Second, it provides a Digital Intelligent Assistant (DIA) for real-time, interactive voice assistance during practical training sessions. This will be demonstrated by allowing trainees to receive step-by-step guidance, instructions, checklists and risk explanations through voice interaction while keeping their hands and eyes focused on electronic boards. This experiment is highly relevant to the WASABI ecosystem because it combines Large Language Models (LLMs) and the OVOS voice framework in AI agents that considerably improve operational efficiency in real cases.
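The first phase described above, removing redundant training once competencies are known, can be sketched as a simple skill-gap computation. This is a hypothetical illustration: the module and skill names are invented, and the real AVANTI/OVOS pipeline extracts competencies from CVs and interviews rather than receiving them as a ready-made set.

```python
# Hypothetical sketch of the skill-gap step: keep only the training modules
# whose skills are not already covered by the trainee's known competencies.
# Module and skill names are invented for illustration.

def plan_training(modules: dict[str, set[str]], known_skills: set[str]) -> list[str]:
    """Return the modules whose skills are not fully covered by known_skills."""
    return [name for name, skills in modules.items()
            if not skills <= known_skills]

modules = {
    "Soldering basics": {"soldering"},
    "PCB inspection": {"visual-inspection", "multimeter"},
    "ESD safety": {"esd-handling"},
}
trainee = {"soldering", "esd-handling"}

print(plan_training(modules, trainee))  # → ['PCB inspection']
```

Dropping the two already-covered modules is exactly the "eliminate redundant training" effect the phase targets; mentors would review the result rather than apply it blindly.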

    The primary challenge that VAFER addresses is the expensive and time-consuming nature of the onboarding process, especially in high-tech manufacturing and electronic repair. Currently, new hires need to read extensive operational procedures, study complex equipment manuals, and spend considerable time training on a physical workbench. This scenario has many pain points: companies view onboarding as a financial burden and productivity drain (although necessary), while trainees may experience high mental load and frustration from redundant skill assessments and the pressure of memorizing lengthy checklists. The second use case, more operational and technical, is highly sensitive in a context like T&G. Integrating new employees disrupts established workflows, and errors made during delicate operations—such as repairing Printed Circuit Boards (PCBs)—can severely damage equipment or compromise safety protocols. A standard chatbot could help, but typing questions on a keyboard obliges operators to look away, reduces attention, and can ultimately lead to mistakes.

    Objective 1: 

    Reduce average onboarding time by up to 15% compared to the current baseline.

    • This will be achieved by utilizing the AVANTI solution and the OVOS-powered Digital Intelligent Assistant to assess existing skills and allow mentors to remove redundant training modules.

    Objective 2:

    Decrease the average rate of errors during practical training sessions by up to 25%.

    • The OVOS AI agent will provide on-demand, hands-free guidance, for example reminding correct sequence of actions and explaining potential risks. Operators will then receive support while keeping visual and manual focus on workbench tasks. To further reduce risks of damages to PCBs, we will also provide a VR-powered, virtual workbench where new hires will exercise in a no-stress, zero-risk environment.

    Objective 3: 

    Improve trainee onboarding satisfaction by up to 20% and enhance training retention by up to 15%.

    • The preliminary interaction will lead to a reduction or removal of topics where trainees are already proficient, reducing frustration.

     

    We will carry out the experiments in the electronic manufacturing and repair sector, specifically focusing on the onboarding and on-the-job support of personnel. To validate the use case, we will rely on our pilot T&G Repair and Remanufacturing Centre in Italy, a facility that specializes in extending the lifecycle of obsolete electronic components and Printed Circuit Boards (PCBs), processing around 13,000 repairs annually. Target users include new hires undergoing training (first scenario), existing employees requiring reskilling (second scenario), and HR professionals or mentors overseeing the process (both).

    Key constraints include managing the varying quality of input data (such as unstructured CVs and dense company manuals), ensuring stringent data privacy and security, and safely integrating the system within an environment that features complex procedures, specialized machinery, and strict safety protocols.

    EXPECTED IMPACT

    The success of the VAFER experiment will be measured through concrete Key Performance Indicators (KPIs) tracked against the company’s current baselines. The measurable targets include a 15% reduction in onboarding time, a 20% improvement in user onboarding satisfaction (measured via pre- and post-assessment surveys), a 25% reduction in errors during practical sessions, and a 15% increase in training retention (measured via mentor assessments).

    End users will benefit from a personalized, engaging training experience that drastically reduces mental burden, while stakeholders will see reduced training costs and smoother workflow integration. Environmentally, by successfully training operators to repair and remanufacture obsolete electronic components rather than discarding them, the solution directly supports sustainability and circular economy principles in the electronics lifecycle.

    AI-MODE

    SYLTEC - Company

    SYLTEC is a company that provides comprehensive engineering and technology consulting services, delivering high value-added solutions tailored to the needs of our clients.

    We develop disruptive solutions that integrate advanced technology to transform key sectors such as industry, healthcare, and tourism & cultural heritage. Our mission is to generate real impact through the application of artificial intelligence, extended reality, and process automation, delivering tools that not only innovate but also provide direct value.

    In the industrial field, we focus on process optimization and intelligent automation, with the aim of reducing execution times, which translates into cost savings and increased productivity.
    In healthcare, we apply cutting-edge technologies for the early detection of diseases and the development of intelligent virtual assistants.
    In tourism and cultural heritage, we create immersive and interactive experiences that bring knowledge closer to diverse audiences.

    SYLTEC is part of the Digital Innovation Hub DIHBU, with which it actively collaborates in digital transformation and artificial intelligence initiatives. One example is the AI-MODE project, where together with DIHBU we are developing a generative AI-based virtual assistant to optimize industrial operations.

    DIHBU’s main role will be to provide the testing environment (PROHIMA’s factory in Barcelona), where the solution will be tested. It will coordinate the testing and verify whether it meets the requirements for prototype validation. In addition, DIHBU will disseminate the project’s progress and results, thus supporting its scalability and adoption by new companies within its ecosystem.

    PROHIMA - Company

    PROHIMA is a manufacturing company specialized in the packaging of wipes and single-dose products for sectors such as cosmetics, perfumery, pharmacy, and parapharmacy. With more than 40 years of experience, it offers individual packaging solutions for liquids, gels, and creams, ensuring quality, innovation, and confidentiality in every project.

    Within the framework of the project, PROHIMA’s factory in Barcelona will be the site for the prototype implementation and will serve as the industrial demonstrator, validating the solution in a real production environment.

    EXPERIMENT OVERVIEW

    OVERVIEW

    SYLTEC, in collaboration with DIHBU, is developing an intelligent virtual assistant tailored for the manufacturing industry.
    The system focuses on two key areas: agile onboarding of new operators and incident resolution.
    It leverages open-source Large Language Models (Llama3, Qwen3, Gemma3…) combined with Retrieval Augmented Generation (RAG) to provide contextual and accurate guidance.
    The assistant will be integrated into the OVOS platform, deployed through Docker containers, and accessed via a lightweight touch-based interface supporting both voice and text interaction.
    The outcome will be a validated prototype in an industrial setting, demonstrating the efficiency of AI-driven assistance in reducing training time and supporting complex processes.
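The RAG approach described above can be sketched as two steps: retrieve the manual chunks most similar to the question, then splice them into the prompt sent to the LLM. The sketch below is an assumption for illustration only: it uses a toy bag-of-words similarity instead of a real embedding model and vector store, and all manual snippets are invented; in the actual system the prompt would go to an open-source LLM such as Llama3.

```python
# Minimal RAG retrieval sketch (toy similarity; invented example snippets).
from collections import Counter
import math

def score(query: str, doc: str) -> float:
    """Cosine similarity between bag-of-words vectors of query and doc."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    dot = sum(q[w] * d[w] for w in set(q) & set(d))
    norm = (math.sqrt(sum(v * v for v in q.values()))
            * math.sqrt(sum(v * v for v in d.values())))
    return dot / norm if norm else 0.0

manual_chunks = [
    "To reset the sealing machine, hold the red button for five seconds.",
    "Replace the wipe roll when the low-material light blinks.",
    "Weekly maintenance: clean the dosing nozzles with alcohol.",
]

def build_prompt(question: str, chunks: list[str], k: int = 1) -> str:
    """Retrieve the top-k chunks and splice them into the LLM prompt."""
    top = sorted(chunks, key=lambda c: score(question, c), reverse=True)[:k]
    context = "\n".join(top)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How do I reset the sealing machine?", manual_chunks)
print(prompt)
```

Grounding answers in retrieved chunks, rather than the model's open-ended knowledge, is what makes the guidance contextual and keeps the hallucination rate measurable.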

    CHALLENGES

    1. Complex onboarding processes in manufacturing environments, requiring significant time and supervision.
    2. Limited AI tools for SMEs that deliver contextualized and reliable support for incident resolution.
    3. Knowledge fragmentation, as operational information is dispersed across manuals, documents, and informal communication.
    4. Integration barriers, since combining LLMs with industrial data and workflows can create technical and usability challenges.
    5. User adoption risks, requiring an interface that is simple, intuitive, and adapted to industrial devices.

    OBJECTIVES

    1. Develop an AI-based assistant that supports onboarding and incident resolution in manufacturing.
    2. Provide enriched interaction (voice and text with visual diagrams).
    3. Design a custom, lightweight interface optimized for industrial touch devices.
    4. Ensure integration with existing platforms (OVOS, Docker) to guarantee scalability and replicability.
    5. Validate the system in a real industrial environment, achieving measurable improvements in accuracy, efficiency, and user satisfaction.

     

    EXPERIMENT IMPACT

    EXPECTED RESULTS (KPIs)

    1. ≥80% accuracy in AI-generated answers (relevance, correctness, context).
    2. ≤15% hallucination rate, ensuring reliability in industrial use cases.
    3. ≥25% reduction in onboarding and incident resolution time, improving productivity.
    4. Development of a functional prototype reaching TRL 6–7, validated in a real manufacturing plant.
    5. Demonstrated replicability and scalability of the assistant for other SMEs and industrial contexts.
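The accuracy and hallucination KPIs above imply a labelled evaluation set. The sketch below is a hypothetical illustration of how those two rates could be computed once each answer has been labelled (by human review or automated grading, which is outside this snippet); the label counts are invented.

```python
# Hypothetical KPI computation over a labelled evaluation set.
# Each answer is labelled 'correct', 'wrong', or 'hallucination'.

def kpi_report(labels: list[str]) -> dict[str, float]:
    """Compute the accuracy and hallucination-rate KPIs from labels."""
    n = len(labels)
    return {
        "accuracy": labels.count("correct") / n,
        "hallucination_rate": labels.count("hallucination") / n,
    }

# Invented example: 20 evaluated answers.
labels = ["correct"] * 17 + ["wrong"] * 1 + ["hallucination"] * 2
report = kpi_report(labels)
print(report)  # accuracy 0.85, hallucination_rate 0.10

# Check against the targets stated in the KPI list.
assert report["accuracy"] >= 0.80 and report["hallucination_rate"] <= 0.15
```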

     

    HumanEnerDIA

    Company and Team

    A Plus Engineering
    • Company Manager / Domain Expert
    • Technical Manager
    • Team Leader
    • IoT & Integrations
    • Front-End
    • AI & Data Science
    • Back-End

    EXPERIMENT OVERVIEW

    OVERVIEW

    Development of a voice- and text-based digital intelligent assistant (DIA) to be integrated into energy management system (EnMS) software, increasing human interaction and well-being while improving the energy performance of industrial facilities.

    CHALLENGES

    Managing an EnMS across a factory is challenging, as it involves almost all processes, including production, maintenance, HR, procurement, planning, quality assurance, and R&D. This complexity demands significant human expertise, communication, coordination, and effort.

    Another challenge is interpreting the vast data from various sources (e.g., raw materials, production, storage, weather) alongside energy data.
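One way an assistant can turn such mixed data into actionable guidance is through an energy performance indicator (EnPI) in the spirit of ISO 50001, relating energy use to production output. The sketch below is an assumption for illustration, not the HumanEnerDIA implementation; the figures, baseline, and tolerance are invented.

```python
# Hypothetical sketch: a simple EnPI (specific energy consumption) check
# whose result a voice assistant could report. All figures are invented.

def enpi(energy_kwh: float, units_produced: float) -> float:
    """Specific energy consumption: kWh per unit produced."""
    return energy_kwh / units_produced

def check_shift(energy_kwh: float, units: float, baseline: float,
                tolerance: float = 0.10) -> str:
    """Compare a shift's EnPI against a baseline with a relative tolerance."""
    value = enpi(energy_kwh, units)
    if value > baseline * (1 + tolerance):
        return f"Alert: {value:.2f} kWh/unit exceeds baseline {baseline:.2f}."
    return f"OK: {value:.2f} kWh/unit within tolerance."

print(check_shift(energy_kwh=5400, units=1000, baseline=4.8))
# 5.40 kWh/unit vs 4.8 * 1.10 = 5.28 threshold → Alert
```

Surfacing such a one-line verdict by voice, instead of requiring users to cross-read dashboards, is the kind of engagement the objective below targets.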

    OBJECTIVES

    Enhance energy engagement to facilitate operations and guide users to monitor efficiency and take action accordingly, through the implementation of digital intelligent assistant elements.

    EXPERIMENT IMPACT

    EXPECTED RESULTS (KPIs)

    • 30% reduction in the overall energy management effort of operational users, including understanding the standards, requirements, documentation, policies and procedures
    • 25% reduction in the technical intervention required of system users to monitor, analyse and report energy efficiency for better energy management tasks
    • Successful integration of Intel50001 into the WASABI technology platform, with DIA implementation of at least 3 different modules including monitoring, analysis and documentation.

    Velo

    CEESA - Company and Team

    Founded in 1986 and located in Getxo (Biscay)
    Focused on the development of Business Management Software

    Mission: To help businesses optimize their processes and leverage corporate knowledge for effective decision-making.

    Jose Luís Cuesta

    CEO

    Leads and coordinates the project

    Ruben Rabadan

    Developer

    Will lead the implementation of the project

    EXPERIMENT OVERVIEW

    OVERVIEW

    Integrate a voice-based assistant into CEESA’s ERP systems

    Leverage NLP, speech recognition and AI to enable intuitive voice commands and intelligent decision-making across ERP modules
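The voice-command flow described above can be sketched as a dispatch step: a speech/NLP layer parses the utterance into an intent with slots, which is then routed to the matching ERP module action. This is a hypothetical illustration; the module names, intents, and handlers below are invented, and CEESA’s actual ERP API is not shown.

```python
# Hypothetical sketch: routing a parsed voice intent to an ERP action.
# Modules, intents, and handlers are invented for illustration.

def create_order(slots: dict) -> str:
    return f"Created order for {slots['qty']} x {slots['item']}"

def stock_query(slots: dict) -> str:
    return f"Stock level requested for {slots['item']}"

# (module, intent) → handler; a real system would call ERP module APIs here.
DISPATCH = {
    ("sales", "create_order"): create_order,
    ("inventory", "stock_query"): stock_query,
}

def handle_intent(module: str, intent: str, slots: dict) -> str:
    handler = DISPATCH.get((module, intent))
    if handler is None:
        return "Unrecognized command."
    return handler(slots)

# e.g. the utterance "order five boxes of wipes" parsed by the NLP layer into:
print(handle_intent("sales", "create_order", {"qty": 5, "item": "wipes"}))
```

Keeping the dispatch table separate from the handlers makes it straightforward to add new voice-enabled ERP workflows without touching the speech layer.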

    CHALLENGES

    • Integrating voice-based technology within complex ERP architectures
    • Ensuring seamless and secure integration
    • Maintaining data privacy, system security and interoperability with existing systems

    OBJECTIVES

    • Streamline ERP processes using voice-based interaction (reducing manual data entry)

    • Optimize specific ERP workflows.

    • Enhance data accessibility and user experience.

    • Improve supply chain visibility.

    • Reduce time and costs for common ERP tasks.

    • Support efficient data management and decision-making within the ERP.

       

    EXPERIMENT IMPACT

    EXPECTED RESULTS (KPIs)

    • Reduce manual data entry time in ERP.
    • Increase efficiency of key ERP workflows.
    • Improve ERP data accuracy.
    • Enhance real-time data accessibility.
    • Reduce time for common ERP tasks.
    • Optimize ERP-driven processes.
    • Increase user satisfaction with ERPs.