Neuro-symbolic artificial intelligence (NeSy AI) is a rapidly evolving field that seeks to combine the perceptive abilities of neural networks with the logical reasoning strengths of symbolic systems. This hybrid approach is designed to address complex tasks that require both pattern recognition and deductive reasoning. NeSy systems aim to create more robust and generalizable AI…
Platooning technology is well known for its capacity to precisely control vehicles, optimize traffic flow, and improve energy efficiency. Platooning reduces aerodynamic drag, boosts fuel efficiency, and expands road capacity by enabling vehicles to move in close proximity and in unison. However, a number of issues arise when it comes to large-scale mixed platoons, which…
Process mining is a branch of data science concerned with analyzing event logs produced by information systems to learn about business processes. This paper addresses process mining techniques, including process discovery. These techniques are important to organizations, particularly for optimizing workflows, improving efficiency, and identifying potential areas for improvement. One major problem in…
Logs provide important insights that are frequently the earliest signs of system problems, making them an essential tool for program maintenance and failure diagnostics. These logs must be effectively parsed for automated log analysis tasks like anomaly identification, troubleshooting, and root cause investigation. The act of turning semi-structured log messages into structured templates is known…
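The parsing step described above, turning semi-structured log messages into structured templates, can be illustrated with a minimal sketch. The masking rules and placeholder names below are assumptions for illustration, not the method any specific paper proposes:

```python
import re

def log_to_template(message: str) -> str:
    """Mask common variable fields (IPs, hex IDs, numbers) to recover a template."""
    masked = re.sub(r"\b\d+\.\d+\.\d+\.\d+\b", "<IP>", message)  # IPv4 addresses
    masked = re.sub(r"\b0x[0-9a-fA-F]+\b", "<HEX>", masked)      # hex identifiers
    masked = re.sub(r"\b\d+\b", "<NUM>", masked)                 # plain numbers
    return masked

template = log_to_template("Connection from 10.0.0.5 failed after 3 retries")
# the IP address and the retry count are replaced by placeholders
```

Real log parsers typically go further (e.g. clustering messages by token structure), but the core idea is the same: separate the constant template from the variable parameters.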
Deep generative models learn continuous data representations from a limited set of training samples, with global metrics like Fréchet Inception Distance (FID) often used to evaluate their performance. However, these models may perform inconsistently across different regions of the learned manifold, especially in foundation models like Stable Diffusion, where generation quality can vary based on…
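The FID metric mentioned above compares two sets of feature vectors by fitting a Gaussian to each and computing the Fréchet distance between them. A minimal sketch of that computation (the feature extractor itself, normally an Inception network, is omitted and the inputs are assumed to be precomputed features):

```python
import numpy as np
from scipy.linalg import sqrtm

def fid(feats_a: np.ndarray, feats_b: np.ndarray) -> float:
    """Fréchet distance between Gaussians fitted to two feature sets (rows = samples)."""
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    covmean = sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):  # numerical noise can introduce tiny imaginary parts
        covmean = covmean.real
    diff = mu_a - mu_b
    return float(diff @ diff + np.trace(cov_a + cov_b - 2.0 * covmean))

rng = np.random.default_rng(0)
feats = rng.normal(size=(500, 4))
score = fid(feats, feats)  # identical sets give a score near zero
```

Because FID summarizes the whole distribution in a single mean and covariance, it is exactly the kind of global metric that can hide region-specific quality differences on the learned manifold.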
Data visualizations (DVs) have become a common practice in the big data era, utilized by various applications and institutions to convey insights from massive raw data. However, creating suitable DVs remains a challenging task, even for experts, as it requires visual analysis expertise and familiarity with the domain data. In addition, users must master complex declarative…
Automated design in artificial intelligence (AI) is an emerging field focusing on developing systems capable of independently generating and optimizing their components. This approach is built on the premise that machine learning can surpass the limitations of manual design, enabling the creation of more efficient, adaptable, and powerful AI systems. The aim is to allow…
Data centers are poised to be among the world’s largest electricity consumers. If there is no meaningful change, they will consume between 10% and 20% of the electricity used in the U.S. by 2030. This explosive growth in energy consumption is driven by rising computational demand, especially from new generative AI applications. Growth at this rate…
Language models (LMs), while powerful at generating human-like text, often produce unstructured and inconsistent outputs. This lack of structure poses challenges in real-world applications, especially for long responses: it becomes difficult to extract specific information, integrate with systems expecting structured data, and present information in formats like tables or lists that…
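One common way to make LM outputs consumable by downstream systems is to require a machine-readable format and validate it before use. A minimal sketch, assuming a hypothetical response schema (the key names are illustrative, not part of the work above):

```python
import json

REQUIRED_KEYS = {"title", "summary", "tags"}  # hypothetical schema for illustration

def parse_structured_response(raw: str) -> dict:
    """Accept only JSON objects containing the expected keys; reject free text."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"unstructured response: {exc}") from exc
    if not isinstance(data, dict):
        raise ValueError("top-level JSON value must be an object")
    missing = REQUIRED_KEYS - data.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    return data

record = parse_structured_response('{"title": "t", "summary": "s", "tags": ["a"]}')
```

A validation gate like this lets the caller retry or re-prompt on malformed output instead of passing inconsistent text to systems that expect structured data.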
Quantum computing has shown great potential to transform specific algorithms and applications and is expected to work alongside traditional High-Performance Computing (HPC) environments. Moreover, Noisy Intermediate-Scale Quantum (NISQ) devices have emerged as powerful computational platforms, but they face challenges such as limited qubit coherence times and high error rates. Due to the complexity…