New Trends in Interaction and Experience Design of SaaS in the Age of AI

With the rapid development of AI technologies such as large language models (LLMs), agents, and the Model Context Protocol (MCP), enterprise-level SaaS software is undergoing a profound transformation in its interaction paradigms. In business applications such as manufacturing execution systems (MES), enterprise resource planning (ERP), and customer relationship management (CRM), AI is no longer just a backend supporting algorithm; it is gradually becoming the core force shaping the human-computer interaction experience, reshaping both how software is used and the value it delivers.

Introduction

💡 According to a recent Gartner prediction, the share of enterprise software with integrated agentic AI will surge from less than 1% in 2024 to 33% in 2028, and by then more than 15% of day-to-day work decisions will be made autonomously by AI agents.

AI Integration in Enterprise Software: 2024–2028

So, how should SaaS companies respond when AI becomes the dominant paradigm of interaction?

AI is Reshaping New Interaction Experiences

At the end of 2022, OpenAI released ChatGPT and set the internet alight. "Conversational interaction" has since become the preferred design choice for AI products. Yet this style of text dialogue has a much older relative: the CLI (Command Line Interface).

CLI: Professional Interaction Driven by Early Technology

The CLI is an interaction form that dates back to the early development of modern computers.

Early computers had limited performance and were used mainly for scientific computing and other professional fields, so their users were mostly technical personnel. Computer systems of that era were text-based and operated through command-line interfaces: users typed specific commands to execute tasks. In UNIX systems, for example, entering the "ls" command lists a directory's contents, and "cd" changes the working directory.

Characteristics of CLI include:

  • Users must memorize a large number of commands and parameters;

  • Visualization is minimal, so interaction is not intuitive;

  • The technical threshold is high.

In short, the CLI demands precise recall of commands and their parameters, which makes it hard to get started, and the interaction process offers little visual feedback. Even so, because it consumes very few system resources, it remains widely used in development and operations scenarios.
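The memorization burden described above is easy to make concrete. The sketch below uses Python's argparse to define a hypothetical "inventory" CLI (the tool name, subcommands, and flags are all invented for illustration): every command and option must be recalled exactly, and nothing on screen hints at what is available.

```python
import argparse

# Hypothetical "inventory" CLI: every command and flag must be known in
# advance -- the interface itself offers no visual hints.
parser = argparse.ArgumentParser(prog="inventory")
sub = parser.add_subparsers(dest="command", required=True)

list_cmd = sub.add_parser("list", help="show stock records")
list_cmd.add_argument("--warehouse", default="A", help="warehouse code")

move_cmd = sub.add_parser("move", help="transfer stock between warehouses")
move_cmd.add_argument("item", help="item code")
move_cmd.add_argument("--qty", type=int, required=True, help="quantity to move")

# The user must type exactly: inventory move bolt-m6 --qty 50
args = parser.parse_args(["move", "bolt-m6", "--qty", "50"])
print(args.command, args.item, args.qty)
```

A typo in any flag name produces an error rather than guidance, which is precisely the usability gap the GUI and, later, the CUI set out to close.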

BlueOS CLI: Developers use the command line to manage plugins and launch options

GUI: Graphical User Interface Brings Computers to the Public

From the late 1970s to the early 1980s, led by Xerox, Apple, and Microsoft, the GUI (Graphical User Interface) gradually matured and spread. A GUI uses graphical elements such as windows, icons, menus, and mouse pointers, letting users interact with computers through clicks and intuitive visual operations. This greatly lowered the barrier to using computers, made them accessible to ordinary users, and drove the popularization of the personal computer.

Even so, GUI also has its own limitations:

1. Interaction complexity rises with business complexity, leading to cumbersome workflows;

2. Feature bloat clutters the interface and drives up learning costs.

Both problems are especially pronounced in enterprise B2B software.

A screenshot of the SAP ERP ECC 6.0 user interface (placing an order)

CUI: The Era of AI-Driven Natural Interaction

Today, thanks to advances in AI, a better answer has emerged: the CUI (Conversational User Interface). Combining the flexibility of the CLI with the ease of use of the GUI, the CUI addresses the weaknesses of both and is becoming a new fulcrum of interaction design.

AI can understand and process human natural language and grasp user intent, so users no longer need to learn and memorize large sets of commands and parameters; they can simply express themselves in natural language.

Conversational AI Helps Users Learn Python

Once an AI Agent understands the user's intent and goal, it can autonomously break the task down and execute it, compressing work that once required complex manual procedures into a process the Agent judges and carries out on its own. The user merely "guides" the Agent through dialogue, and the Agent will, where appropriate, ask the user to confirm actions before executing them. Because it also understands the user's emotions, history, and preferences, its responses are more personal and engaging than before.
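The core CUI loop described above, recognize the intent, then decide and act, can be sketched in a few lines. This toy uses keyword rules where a real product would use an LLM; all intent names and replies here are invented for illustration.

```python
# Toy CUI loop: free-form text -> intent -> action. A real system would
# use an LLM for intent recognition; keyword matching stands in here.

INTENT_KEYWORDS = {
    "query_progress": ["progress", "status", "how far"],
    "undo_record": ["undo", "revert", "cancel"],
}

def recognize_intent(utterance: str) -> str:
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in text for k in keywords):
            return intent
    return "unknown"

def handle(utterance: str) -> str:
    intent = recognize_intent(utterance)
    if intent == "query_progress":
        return "Fetching production progress..."
    if intent == "undo_record":
        # The agent confirms destructive actions before executing them.
        return "Please confirm the undo operation."
    return "Sorry, could you rephrase that?"

print(handle("What's the progress of order SO-1024?"))
```

Note that the destructive intent routes through a confirmation step, matching the "Agent asks before acting" behavior described above.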

CLI → GUI → CUI Interaction Evolution

The evolution of interaction from CLI to GUI and then to CUI reflects the shift of computers from “making people adapt” to “actively understanding people,” propelling human-computer relations into a more natural and intelligent new era.

Can CUI Completely Replace GUI?

From the current application practice, CUI (Conversational User Interface) cannot completely replace GUI (Graphical User Interface); both have their own advantages in different scenarios and will coexist for a long time in the future.

“GUI has irreplaceable advantages in information-intensive tasks and concurrent operations (such as financial reports, design tools).”

——Nielsen Norman Group

Moreover, research from the Nielsen Norman Group has pointed out that appropriately introducing prompt controls in generative AI chatbots can improve functionality discoverability, reduce user input, and increase interaction efficiency.

Prompt Controls in GenAI Chatbots: 4 Main Uses and Best Practices – Feifei Liu, Nielsen Norman Group

The Core Advantages of GUI

1. Powerful multitasking capability

With visual elements such as menus, icons, and buttons, a GUI supports multi-window collaboration and multimodal input (mouse, keyboard, touch), making it well suited to operation-heavy scenarios such as finance and design.

2. High information density and efficient communication

Charts and graphics can intuitively present complex data, helping users quickly grasp key information.

3. Precision and efficiency in operations

For frequent and clear command operations, clicks, drags, and shortcuts in GUI have inherent advantages.

4. Clear interface structure with minimal semantic ambiguity

Defined visual hierarchies and control feedback mechanisms reduce user understanding costs and enhance interaction certainty.

Thus, the rise of CUI does not mean the disappearance of GUI. On the contrary, future human-computer interaction will present a multimodal fusion trend—CUI, GUI, and various interaction methods such as voice, images, and gestures will merge and develop together to create a more natural and efficient user experience.

Multimodality: Integration is the Future Direction

Multimodality refers to the integration of different perception and input modalities (such as language, images, voice, gestures, etc.), which is key to enhancing the naturalness, efficiency, and flexibility of human-computer interaction.

Microsoft Design (2022): “Multimodal interfaces—blending visual, voice, and gesture input—will dominate user experience evolution.”

In practice, users can flexibly switch among voice, text, mouse, and gestures depending on the situation, while the system outputs results in whatever form best suits the intent and content: charts, text, or speech. Quality inspectors, for instance, can complete inspections and file reports entirely by voice, freeing themselves from paper forms and device switching and markedly improving frontline efficiency.
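The "output in the most suitable form" idea can be sketched as a simple dispatch rule. This is a toy heuristic, not any product's actual logic: structured rows render as a table, a voice session gets speech, and everything else falls back to text.

```python
# Toy modality dispatcher: pick an output form from the result's shape
# and the session context. Real systems would weigh intent, device,
# and user preference; this sketch only illustrates the idea.

def choose_modality(result, session_mode: str) -> str:
    if session_mode == "voice":
        return "speech"      # hands-busy users hear the answer
    if isinstance(result, list) and result and isinstance(result[0], dict):
        return "table"       # structured rows render best graphically
    return "text"            # short answers stay as plain text

rows = [{"order": "SO-1", "done": 120}, {"order": "SO-2", "done": 80}]
print(choose_modality(rows, "desktop"))        # → table
print(choose_modality("All clear.", "voice"))  # → speech
```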

Multimodal UI is evolving towards “the fusion of GUI + CUI.” For example, in Microsoft 365 Copilot and Notion AI, the interface is no longer a static toolbar but includes a “floating dialogue layer” that responds to semantic input.

Microsoft 365 Copilot & Newhecloud AI MES Copilot

AI is no longer just an external tool but is embedded in the system, becoming an intelligent assistant that can be summoned and respond to semantics at any time:

  • It can hover in a floating window, ready to respond at any moment;

  • It can sit in a side panel, making parallel work easier;

  • It can even appear contextually as the mouse focus moves, becoming part of the user's working habits.

🗣️ “If you can speak, don’t click; if you can ask, don’t search” will become the mainstream interaction model.

Personalization: A Leap from Tool to Assistant

AI is evolving SaaS software from “general tools” to “exclusive assistants.”

AI is no longer just passively waiting to be called but is actively understanding users, pushing information, and providing suggestions. This is a “tailored experience” driven by data and AI. Each user logging into the system sees content that is most suitable for their role and context:

  • Production supervisors entering MES will see today’s most critical work orders automatically pop up;

  • Before visiting clients, CRM pushes the latest trends and customized suggestions related to those clients.

Typical Cases:

Salesforce Einstein: Uses machine learning to predict customer needs, providing personalized suggestions for sales teams, increasing conversion rates by more than 30%.

HubSpot: AI automatically builds customer journey maps, enhancing marketing reach efficiency, with customer participation increasing by 25%.

Einstein Assistant generates customer descriptions, automatically surfacing potential opportunities for sales staff

The Exploration and Practice of Newhecloud

Facing the global wave of AI, Newhecloud, an industry leader in AI MES, is deeply rooted in manufacturing and is actively applying AI to MES systems to raise their level of intelligence. Drawing on its understanding of UI and UX design for AI-integrated SaaS, Newhecloud has also built practical solutions around its own business. Below are two typical scenarios of concern to more than 70% of Newhecloud users:

View Order Production Progress

Production managers and sales staff want real-time visibility into the production progress of sales orders: the completion status of each process step, work-in-process counts, quantities completed, exception information, and so on. In the traditional workflow, reaching this information takes several steps.

Under the previous interaction model, users had to:

Steps for checking order production progress - before using AI conversational interaction

Eight steps in total; if there are multiple production orders, the user must go back and open the next production order's details, and every additional production order adds two steps.

After introducing AI conversational interaction:

Steps for checking order production progress - after using AI conversational interaction

Only three steps: locate the sales order and ask in dialogue. The AI recognizes the intent, automatically reads the order number from the current page, runs a correlated query, finds the associated production orders, and returns their key information: production status, process steps, and progress. Users who want more detail can jump straight to the corresponding production order's detail page.

Newhecloud AI MES - View Order Production Progress
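The three-step flow above can be sketched as follows. The data and field names (sales_order, PRODUCTION_ORDERS, and so on) are invented for illustration and are not the Newhecloud API: the assistant reads the sales-order number from page context and returns the key fields of every linked production order.

```python
# Illustrative data: production orders linked to sales orders.
PRODUCTION_ORDERS = [
    {"prod_no": "MO-001", "sales_no": "SO-1024", "step": "assembly", "done": 350, "status": "in progress"},
    {"prod_no": "MO-002", "sales_no": "SO-1024", "step": "packing", "done": 0, "status": "queued"},
    {"prod_no": "MO-003", "sales_no": "SO-2000", "step": "cutting", "done": 10, "status": "in progress"},
]

def progress_for(page_context: dict) -> list:
    """Correlated query: all production orders for the sales order on screen."""
    sales_no = page_context["sales_order"]   # taken from the current page, not typed by the user
    return [o for o in PRODUCTION_ORDERS if o["sales_no"] == sales_no]

for order in progress_for({"sales_order": "SO-1024"}):
    print(order["prod_no"], order["status"], order["step"], order["done"])
```

The user never navigates anywhere: the page context supplies the order number, and the assistant fans out to every linked production order in one turn.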

Modify Entry and Exit Records

In daily material management, operational errors, inventory discrepancies, or data anomalies sometimes make it necessary to correct or adjust an inbound or outbound record in the system, so that the books match reality and production, inventory, and financial data remain unaffected.

Under the previous interaction model, users had to:

Steps for modifying entry and exit records - before using AI conversational interaction

Eight steps in total; if multiple entry and exit records are involved, the user must click "Undo" on each one, and every additional record adds one step.

After using AI conversational interaction:

Steps for modifying entry and exit records - after using AI conversational interaction

Only three steps. Users do not need to navigate to the entry and exit history at all: they can summon the "assistant" from any screen and state the query criteria in natural language. The AI recognizes the intent, filters the records accordingly, and presents the matches in a table it deems appropriate (itself a form of multimodal output). Once the user confirms the records, a single "undo" instruction leads the AI to call the corresponding interface and execute the operation.

Newhecloud AI MES - Query Records

Newhecloud AI MES - Execute Undo Operation
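The two-turn dialogue above, filter first, confirm, then undo, can be sketched like this. The record fields and the undo_record helper are hypothetical stand-ins, not the actual Newhecloud interface.

```python
# Illustrative entry/exit records; "undone" marks reverted entries.
RECORDS = [
    {"id": 1, "item": "bolt-m6", "type": "out", "qty": 50, "undone": False},
    {"id": 2, "item": "bolt-m6", "type": "out", "qty": 30, "undone": False},
    {"id": 3, "item": "nut-m6", "type": "in", "qty": 200, "undone": False},
]

def find_records(item: str, rec_type: str) -> list:
    """Turn 1: filter by the user's natural-language criteria."""
    return [r for r in RECORDS if r["item"] == item and r["type"] == rec_type]

def undo_record(rec: dict) -> None:
    """Turn 2: in a real system this would call the undo API."""
    rec["undone"] = True

matches = find_records("bolt-m6", "out")   # assistant shows these as a table
for r in matches:                          # user confirms; one instruction undoes all
    undo_record(r)
print(sum(r["undone"] for r in RECORDS))   # → 2
```

Compare this with the GUI flow: the per-record "Undo" clicks collapse into one confirmed batch operation, which is where the step savings come from.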

Comparing these two typical scenarios shows that users only need to state their goal and make judgment calls; the chain of concrete actions required to achieve that goal is delegated to the "assistant." This sharply reduces operation steps and lowers the learning cost of software such as MES, ERP, and CRM, letting users become productive at almost zero cost. The integration relies mainly on the MCP technology discussed in our earlier article: https://www.xinheyun.com/blog/why-mcp.
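The tool-calling idea behind MCP can be illustrated with a toy registry: business operations are registered as named tools that a model can discover and invoke by name with arguments. This is a simplified stand-in, not the actual MCP protocol or the Newhecloud implementation, and the tool names are invented for illustration.

```python
# Toy tool registry illustrating the pattern MCP standardizes:
# operations are exposed as named, discoverable, callable tools.
TOOLS = {}

def tool(name: str):
    """Decorator that registers a function under a tool name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("query_progress")
def query_progress(sales_no: str) -> str:
    return f"2 production orders linked to {sales_no}"

@tool("undo_record")
def undo_record(record_id: int) -> str:
    return f"record {record_id} undone"

# The model emits a tool name plus arguments; the host dispatches the call.
call = {"name": "query_progress", "arguments": {"sales_no": "SO-1024"}}
print(TOOLS[call["name"]](**call["arguments"]))
```

In an MCP setup the registry, discovery, and invocation are standardized by the protocol, so any compliant model can drive the same set of business tools.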

This kind of conversational interaction entry will be increasingly integrated into the Newhecloud AI MES system. We ensure that such conversational entry points can be used anytime, anywhere, to maximally meet user usage scenarios and truly become an intelligent “assistant” that is always available.

Meanwhile, to serve personalized enterprise needs, Newhecloud AI MES will fully open these entry points, allowing customers to build custom functional scenarios through OpenAPI and plugins and embed them in their own business processes, meeting the need for flexible access and deep customization.

Conclusion and Outlook

The deep integration of AI technology is reshaping the interaction logic and user experience of enterprise-level SaaS software. From personalized recommendations to process automation, from natural language interactions to intelligent decision support, AI has not only significantly enhanced system efficiency and user stickiness but also promoted SaaS’s transformation from traditional “tools” to “intelligent partners.”

Newhecloud is committed to promoting this change. In our AI-native architecture, conversational interactions (CUI) are gradually becoming the main entrance, coexisting with graphical interfaces (GUI) to provide users with a more intelligent, natural, and seamless operation experience.

Looking at the evolution of interaction forms, human-computer interaction keeps moving toward greater humanization, contextualization, and autonomy. From the command-line CLI to the graphical GUI to the voice- and natural-language-driven CUI, each leap has further optimized both "efficiency" and "experience." The truly ideal interaction hides the technology behind the need: users do not learn complex operations; they simply express their intent, and the system understands and completes the task.

The future UI will no longer be dominated by any single form; instead, multiple modalities will coexist, fuse, and evolve together. What Newhecloud is building is precisely such an "always available, naturally responsive" AI MES system: where dialogue is operation and what you see is what you get, bridging the comprehension gap between humans and systems, moving from passive response to proactive service, and delivering a truly intelligent, personalized enterprise digital experience.

Sources cited in the article:

https://www.gartner.com/cn/information-technology/articles/ai-agents

https://microsoft.design/articles/ux-design-for-agents

https://www.aufaitux.com/blog/ai-automation-ui-ux-design-tools/


Introduction

💡 According to Gartner's latest prediction, the proportion of integrated autonomous AI in enterprise software will surge from less than 1% in 2024 to 33% in 2028. Meanwhile, over 15% of daily work decisions will be autonomously completed by AI agents.

AI Integration in Enterprise Software: 2024–2028ALT

So, how should SaaS companies respond when AI becomes the dominant paradigm of interaction?

With the rapid development of AI technologies such as Large Language Models (LLM), Agents, and Model Context Protocols (MCP), enterprise-level SaaS software is undergoing a profound transformation in interaction paradigms. In business applications such as Manufacturing Execution Systems (MES), Enterprise Resource Planning (ERP), and Customer Relationship Management (CRM), AI is no longer just a supporter of backend algorithms; it is gradually becoming the core force that dominates the human-computer interaction experience, reshaping the way software is used and valued.

AI is Reshaping New Interaction Experiences

At the end of 2022, OpenAI released ChatGPT, which ignited the entire internet. “Conversational interaction” has become the preferred design choice for AI products. However, this text dialogue interaction method has similar forms that have existed for a long time, namely CLI (Command Line Interface).

CLI: Professional Interaction Driven by Early Technology

CLI is an interaction form from the early development of modern computers.

Early computers had limited performance and were mainly used for scientific computing and professional fields, with users mostly being technical personnel. At that time, computer systems were text-based and operated through command line interfaces. Users needed to input specific commands to execute various tasks. For example, in UNIX systems, entering the “ls” command displays the directory list, and the “cd” command switches directories.

Characteristics of CLI include:

  • Need to memorize a large number of commands and parameters;

  • Low degree of visualization, not intuitive enough;

  • High technical threshold requirements;

Because the CLI demands that users memorize many commands and their parameters, it sets a high technical bar and is hard to get started with, and the interaction itself offers little visual feedback. Even so, because it consumes few system resources, it remains widely used in development and operations scenarios.

BlueOS CLI: Developers use the command line to manage plugins and launch options

GUI: Graphical User Interface Brings Computers to the Public

From the late 1970s to the early 1980s, led by Xerox, Apple, and Microsoft, GUI (Graphical User Interface) gradually developed and expanded. GUI uses graphical elements such as windows, icons, menus, and mouse pointers, allowing users to interact with computers through mouse clicks and intuitive graphical operations. This greatly lowered the barrier to using computers, making them more easily accepted and used by ordinary users, thus promoting the popularity of personal computers.

Even so, GUI has its own limitations:

1. Interaction complexity rises with business complexity, leading to cumbersome processes;

2. Feature bloat produces cluttered interfaces and high learning costs.

Both problems are particularly prominent in enterprise-level B-side software.

A screenshot of the user interface in SAP ERP ECC 6.0 (placing an order)

CUI: The Era of AI-Driven Natural Interaction

Today, the development of AI has produced a better answer: CUI (Conversational User Interface). By combining the flexibility of CLI with the usability of GUI, CUI addresses the shortcomings of both and is becoming a new fulcrum of interaction design.

AI can understand and process human natural language and grasp user intent, so users no longer need to be trained on, or memorize, large numbers of commands and parameters; they can simply express themselves in natural language.

Conversational AI Helps Users Learn Python

Once an AI agent understands the user's intent and goal, it can autonomously decompose and execute the task, turning what used to be a complex sequence of operations into a process the agent judges and carries out on its own. During this process the user only needs to guide the agent through dialogue, and for some execution actions the agent will ask the user to confirm or decide. Because the agent can also take the user's emotions, past behavior, and preferences into account when responding, the interaction is more personalized and more emotionally aware than before.
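The agent pattern described here, understand the goal, decompose it into steps, confirm sensitive actions, then execute, can be sketched in a few lines of Python. Everything below (the example goal, the step names, the `confirm` callback) is a hypothetical illustration, not any particular product's implementation.

```python
# Minimal agent-loop sketch: decompose a goal into steps, ask the user
# to confirm destructive steps, then execute each step in order.

def decompose(goal):
    # A real agent would use an LLM here; we hard-code one example goal.
    if goal == "undo last stock-out record":
        return [("query", "find last stock-out record"),
                ("undo", "revert that record")]
    return []

def run_agent(goal, execute, confirm=lambda step: True):
    """Run each step; destructive steps ('undo') require user confirmation."""
    done = []
    for kind, description in decompose(goal):
        if kind == "undo" and not confirm(description):
            continue  # user declined; skip the destructive step
        execute(kind, description)
        done.append(kind)
    return done
```

The key design point is that the user supplies intent once, while the loop handles sequencing; the `confirm` hook is where the "optionally confirm and decide" behavior lives.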

CLI → GUI → CUI Interaction Evolution

The evolution of interaction from CLI to GUI and then to CUI reflects the shift of computers from “making people adapt” to “actively understanding people,” propelling human-computer relations into a more natural and intelligent new era.

Can CUI Completely Replace GUI?

From the current application practice, CUI (Conversational User Interface) cannot completely replace GUI (Graphical User Interface); both have their own advantages in different scenarios and will coexist for a long time in the future.

“GUI has irreplaceable advantages in information-intensive tasks and concurrent operations (such as financial reports, design tools).”

——Nielsen Norman Group

Moreover, research from the Nielsen Norman Group has pointed out that appropriately introducing prompt controls in generative AI chatbots can improve functionality discoverability, reduce user input, and increase interaction efficiency.

Prompt Controls in GenAI Chatbots: 4 Main Uses and Best Practices – Feifei Liu, Nielsen Norman Group

The Core Advantages of GUI

1. Powerful multitasking capability

With visual elements such as menus, icons, and buttons, GUI supports multi-window collaboration and multimodal input (mouse, keyboard, touch), making it well suited to operation-heavy scenarios such as finance and design.

2. High information density and strong conveying efficiency

Charts and graphics can intuitively present complex data, helping users quickly grasp key information.

3. Precision and efficiency in operations

For frequent and clear command operations, clicks, drags, and shortcuts in GUI have inherent advantages.

4. Clear interface structure with minimal semantic ambiguity

Defined visual hierarchies and control feedback mechanisms reduce user understanding costs and enhance interaction certainty.

Thus, the rise of CUI does not mean the disappearance of GUI. On the contrary, future human-computer interaction will present a multimodal fusion trend—CUI, GUI, and various interaction methods such as voice, images, and gestures will merge and develop together to create a more natural and efficient user experience.

Multimodality: Integration is the Future Direction

Multimodality refers to the integration of different perception and input modalities (such as language, images, voice, gestures, etc.), which is key to enhancing the naturalness, efficiency, and flexibility of human-computer interaction.

Microsoft Design (2022): “Multimodal interfaces—blending visual, voice, and gesture input—will dominate user experience evolution.”

In practical applications, users can flexibly use voice, text, mouse, gestures, etc., according to the scene, while the system can output results in the most suitable form, such as charts, text, or voice, based on intent and content. For example, quality inspectors can complete inspections and reports directly through voice, freeing themselves from paper and equipment switching, significantly improving frontline operational efficiency.
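The idea of outputting results "in the most suitable form based on intent and content" can be illustrated with a trivial dispatch. The categories below are assumptions made for the example, not a product feature list.

```python
# Sketch: pick an output modality from what the result contains and how
# the user is working. The categories are illustrative only.

def choose_output_form(result_kind, hands_free=False):
    if hands_free:                # e.g. a quality inspector working by voice
        return "voice"
    if result_kind == "trend":    # time-series data reads best as a chart
        return "chart"
    if result_kind == "records":  # row-like data reads best as a table
        return "table"
    return "text"                 # default: a plain natural-language reply
```

A real system would make this decision with a model rather than fixed rules, but the contract is the same: intent and content in, modality out.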

Multimodal UI is evolving towards “the fusion of GUI + CUI.” For example, in Microsoft 365 Copilot and Notion AI, the interface is no longer a static toolbar but includes a “floating dialogue layer” that responds to semantic input.

Microsoft 365 Copilot & Newhecloud AI MES Copilot

AI is no longer just an external tool but is embedded in the system, becoming an intelligent assistant that can be summoned and respond to semantics at any time:

  • It can hover in a floating window over the interface, ready to respond at any moment;

  • it can be embedded in a side panel, making multi-threaded work convenient;

  • it can even appear flexibly as the mouse focus moves, becoming part of the user's operating habits.

🗣️ “If you can speak, don’t click; if you can ask, don’t search” will become the mainstream interaction model.

Personalization: A Leap from Tool to Assistant

AI is evolving SaaS software from “general tools” to “exclusive assistants.”

AI is no longer just passively waiting to be called but is actively understanding users, pushing information, and providing suggestions. This is a “tailored experience” driven by data and AI. Each user logging into the system sees content that is most suitable for their role and context:

  • Production supervisors entering MES will see today’s most critical work orders automatically pop up;

  • Before visiting clients, CRM pushes the latest trends and customized suggestions related to those clients.

Typical Cases:

Salesforce Einstein: Uses machine learning to predict customer needs, providing personalized suggestions for sales teams, increasing conversion rates by more than 30%.

HubSpot: AI automatically builds customer journey maps, improving marketing reach efficiency, with customer engagement up 25%.

Einstein Assistant Generates Customer Descriptions, Automatically Identifying Potential Opportunities for Sales Staff

The Exploration and Practice of Newhecloud

Facing the global wave of AI, Newhecloud, an industry leader in the AI MES field, has been working deeply in manufacturing and actively applying AI technology in its MES system to raise the system's level of intelligence. Based on its understanding of UI and UX design for AI-integrated SaaS, Newhecloud has also built practical solutions around its own business. Below we take as examples two typical scenarios that concern more than 70% of Newhecloud users:

View Order Production Progress

Production managers and sales staff want to track the production progress of sales orders in real time: the completion status of each process, the number of items in process, the quantity completed, exception information, and so on. In the traditional workflow, viewing this information takes several steps.

According to past interactions, users need to:

Steps for checking order production progress - before using AI conversational interaction

A total of 8 steps. If there are multiple production orders, the user must go back and open the next production order's details, and each additional production order adds 2 steps.

After introducing AI conversational interaction:

Steps for checking order production progress - after using AI conversational interaction

Only three steps: locate the sales order and ask through dialogue. The AI recognizes the user's intent, automatically picks up the order number on the current page, and runs a correlated query, finding the associated production orders and returning their key information, such as production status, process steps, and process progress. If the user wants more detail, they can also jump directly to the corresponding production order.

Newhecloud AI MES - View Order Production Progress
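The query flow in this scenario is essentially a correlated lookup: read the order number from the current page context, find its production orders, and return the key fields. The function names and data below are hypothetical illustrations, not Newhecloud's actual interfaces.

```python
# Hypothetical data: sales order -> its associated production orders.
PRODUCTION_ORDERS = {
    "SO-1001": [
        {"prod_order": "PO-2001", "status": "in progress", "step": "assembly", "progress": 0.6},
        {"prod_order": "PO-2002", "status": "completed",   "step": "packing",  "progress": 1.0},
    ],
}

def get_current_page_context():
    # A real assistant would read this from the UI; fixed for the sketch.
    return {"sales_order": "SO-1001"}

def query_order_progress():
    """Correlated query: current page's order number -> production orders."""
    order_no = get_current_page_context()["sales_order"]
    return [(po["prod_order"], po["status"], po["progress"])
            for po in PRODUCTION_ORDERS.get(order_no, [])]
```

The point of the sketch is the shape of the flow: the user never supplies the order number, because the assistant takes it from context.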

Modify Entry and Exit Records

During the daily material management process, due to operational errors, inventory discrepancies, and data anomalies, it is necessary to correct or adjust a certain outgoing or incoming record in the system to ensure the consistency of accounts and reality, avoiding impacts on production, inventory, or financial data.

According to past interactions, users need to:

Steps for modifying entry and exit records - before using AI conversational interaction

A total of 8 steps. If there are multiple applicable entry and exit records, they would need to click “Undo” one by one, and each additional entry and exit record increases the operation steps by 1.

After using AI conversational interaction:

Steps for modifying entry and exit records - after using AI conversational interaction

Only three steps. The user does not need to navigate to the entry-and-exit history at all: from any screen they can summon the “assistant” and state the query criteria in natural language. The AI recognizes the intent, filters on those criteria to find all matching entry and exit records, and presents them in whatever table form it judges reasonable (an instance of multimodal output). Once the user confirms the records are correct, they tell the “assistant” to undo them, and the AI recognizes that intent and calls the corresponding interface to perform the operation.

Newhecloud AI MES - Query Records

Newhecloud AI MES - Execute Undo Operation
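The record-undo scenario reduces to a filter-confirm-undo flow. The record fields and the in-place undo below are hypothetical stand-ins for the real interfaces; the sketch only shows the control flow.

```python
# Hypothetical entry/exit records.
RECORDS = [
    {"id": 1, "material": "M-01", "direction": "out", "undone": False},
    {"id": 2, "material": "M-01", "direction": "in",  "undone": False},
    {"id": 3, "material": "M-02", "direction": "out", "undone": False},
]

def filter_records(material=None, direction=None):
    """Step 1: the assistant filters records by the user's stated criteria."""
    return [r for r in RECORDS
            if (material is None or r["material"] == material)
            and (direction is None or r["direction"] == direction)]

def undo_records(records, confirmed):
    """Step 2: only after the user confirms does the assistant perform undo."""
    if not confirmed:
        return 0
    for r in records:
        r["undone"] = True  # stand-in for calling the real undo interface
    return len(records)
```

Separating the filter step from the undo step is what lets the assistant show the matching records as a table and wait for confirmation before acting.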

Comparing these two typical scenarios shows that users only need to state their purpose plainly and make judgments; the specific execution actions needed to achieve that purpose can be handed over to the “assistant.” This markedly reduces operation steps and dramatically lowers the learning cost of software such as MES, ERP, and CRM, letting users start working productively at almost zero cost. The integration relies chiefly on the MCP technology discussed in our earlier article: https://www.xinheyun.com/blog/why-mcp.
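In MCP, system capabilities are exposed to the model as described tools: each tool carries a name, a description, and a JSON Schema for its input. The declaration below shows that shape for a hypothetical undo capability; the tool name and fields are invented for illustration.

```python
# Shape of an MCP-style tool declaration: name, description, and a
# JSON Schema describing the arguments the model may pass when calling it.
UNDO_TOOL = {
    "name": "undo_stock_record",  # hypothetical tool name
    "description": "Undo an inbound or outbound stock record by id.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "record_id": {"type": "integer", "description": "Record to undo"},
        },
        "required": ["record_id"],
    },
}
```

Because the description and schema are machine-readable, the model can decide on its own when to call the tool and what arguments to pass, which is what makes the "express intent, system executes" experience possible.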

This kind of conversational entry point will be integrated ever more deeply into the Newhecloud AI MES system. We ensure that such entry points can be used anytime, anywhere, covering user scenarios as fully as possible, so the system truly becomes an intelligent “assistant” that is always available.

Meanwhile, to meet individual enterprises' needs, Newhecloud AI MES will also fully open these entry points, letting customers build personalized functional scenarios through OpenAPI and plugins and embed them into their own business processes, satisfying demands for flexible access and deep customization to the greatest extent.

Conclusion and Outlook

The deep integration of AI technology is reshaping the interaction logic and user experience of enterprise-level SaaS software. From personalized recommendations to process automation, from natural language interactions to intelligent decision support, AI has not only significantly enhanced system efficiency and user stickiness but also promoted SaaS’s transformation from traditional “tools” to “intelligent partners.”

Newhecloud is committed to promoting this change. In our AI-native architecture, conversational interactions (CUI) are gradually becoming the main entrance, coexisting with graphical interfaces (GUI) to provide users with a more intelligent, natural, and seamless operation experience.

Looking at the evolution of interaction forms, human-computer interaction keeps moving toward greater humanization, contextualization, and autonomy. From the command-line CLI to the graphical GUI, and on to voice- and natural-language-driven CUI, each leap is a further optimization of both efficiency and experience. The truly ideal interaction is one where the technology hides behind the need: users do not have to learn complex operations; they only express their intent, and the system understands and completes the task.

The future UI will no longer be dominated by any single form; rather, multiple modalities will fuse, coexist, and evolve together. What Newhecloud is building is precisely such an “always available, naturally responsive” AI MES system: dialogue is operation, and what you see is what you get. It helps users bridge the understanding gap between human and system, moves from passive response to proactive service, and truly delivers an intelligent, personalized enterprise digital experience.

Sources cited in the article:

https://www.gartner.com/cn/information-technology/articles/ai-agents

https://microsoft.design/articles/ux-design-for-agents

https://www.aufaitux.com/blog/ai-automation-ui-ux-design-tools/


FAQs

Learn More? Contact us!

1. What types of discrete manufacturing is New Core Cloud suitable for?
2. What are New Core Cloud's competitive advantages in consumer electronics?
3. What is the typical implementation timeline for New Core Cloud?
4. Does the solution support end-to-end traceability?
5. How does New Core Cloud charge for its services?
6. Can you integrate with ERP systems?
7. Is an English version available for international operations?


Contact us!

Learn more? Contact us !

+(86)400-164-1521

Headquarters: 10th Floor, Building A6, No. 1528, Gumei Road, Xuhui District, Shanghai, China

Singapore · Guangzhou · Chengdu · Hangzhou · Hefei · Nanjing · Shijiazhuang
