Logs provide important insights that are frequently the earliest signs of system problems, making them an essential tool for program maintenance and failure diagnostics. These logs must be effectively parsed for automated log analysis tasks like anomaly identification, troubleshooting, and root cause investigation. The act of turning semi-structured log messages into structured templates is known…
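The template extraction described above can be illustrated with a minimal sketch: masking the variable fields of a raw log message (IP addresses, hex identifiers, numbers) so that messages differing only in parameters collapse to one template. The regexes and the `<*>` placeholder are illustrative choices, not any particular parser's convention; production parsers such as Drain learn templates from the log stream instead.

```python
import re

def extract_template(message: str) -> str:
    """Mask variable fields in a log message to recover its template.

    The masking patterns below are a simplified sketch; real log parsers
    infer which tokens are parameters rather than relying on fixed regexes.
    """
    masked = re.sub(r"\b\d{1,3}(?:\.\d{1,3}){3}\b", "<*>", message)  # IPv4 addresses
    masked = re.sub(r"\b0x[0-9a-fA-F]+\b", "<*>", masked)            # hex identifiers
    masked = re.sub(r"\b\d+\b", "<*>", masked)                       # plain integers
    return masked

# Two messages with different parameters map to the same template.
a = extract_template("Connection from 10.0.0.5 closed after 342 ms")
b = extract_template("Connection from 192.168.1.9 closed after 7 ms")
```

Downstream analysis tasks then operate on template identities and the masked parameters rather than on raw message strings.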
Deep generative models learn continuous data representations from a limited set of training samples, with global metrics like Fréchet Inception Distance (FID) often used to evaluate their performance. However, these models may perform inconsistently across different regions of the learned manifold, especially in foundation models like Stable Diffusion, where generation quality can vary based on…
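FID, mentioned above as a global metric, reduces to the closed-form Fréchet distance between two Gaussians fitted to feature statistics (in practice, Inception-network features of real and generated images). A small sketch of that closed form, using made-up moments rather than real feature statistics:

```python
import numpy as np

def sqrtm_psd(m: np.ndarray) -> np.ndarray:
    """Matrix square root of a symmetric positive semidefinite matrix."""
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.T

def frechet_distance(mu1, sigma1, mu2, sigma2) -> float:
    """Closed-form Frechet distance between two Gaussians:
    ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 (S1^(1/2) S2 S1^(1/2))^(1/2))."""
    s1_half = sqrtm_psd(sigma1)
    covmean = sqrtm_psd(s1_half @ sigma2 @ s1_half)
    return float(np.sum((mu1 - mu2) ** 2) + np.trace(sigma1 + sigma2 - 2.0 * covmean))

# Hypothetical feature statistics: identical Gaussians give distance 0;
# shifting one mean by a unit vector gives distance 1.
mu, cov = np.zeros(2), np.eye(2)
d_same = frechet_distance(mu, cov, mu, cov)
d_shift = frechet_distance(mu, cov, np.array([1.0, 0.0]), cov)
```

Because the metric summarizes two distributions with a single pair of moments, it is global by construction, which is exactly why it can hide regionally inconsistent generation quality.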
Data visualizations (DVs) have become a common practice in the big data era, utilized by various applications and institutions to convey insights from massive raw data. However, creating suitable DVs remains a challenging task, even for experts, as it requires visual analysis expertise and familiarity with the domain data. Moreover, users must master complex declarative…
Automated design in artificial intelligence (AI) is an emerging field focusing on developing systems capable of independently generating and optimizing their components. This approach is built on the premise that machine learning can surpass the limitations of manual design, enabling the creation of more efficient, adaptable, and powerful AI systems. The aim is to allow…
Data centers are poised to be among the world’s largest electricity consumers. If there is no meaningful change, they will consume between 10% and 20% of the electricity used in the U.S. by 2030. This explosive growth in energy use is driven by rising computational demand, especially from new generative AI applications. Growth at this rate…
Language models (LMs), while powerful in generating human-like text, often produce unstructured and inconsistent outputs. The lack of structure in responses poses challenges in real-world applications, especially in long responses. It becomes difficult to extract specific information, integrate with systems expecting structured data, and present information in formats like tables or lists that…
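The integration difficulty described above is commonly handled by requiring the LM to emit JSON and validating it before any downstream system consumes it. A minimal sketch, where the target schema (`name`, `year`) and the response strings are hypothetical:

```python
import json

REQUIRED_FIELDS = {"name": str, "year": int}  # hypothetical target schema

def parse_structured(response: str) -> dict:
    """Parse an LM response expected to be JSON and check required field types.

    Raises ValueError when the output does not match the schema -- a common
    guard before feeding LM output to systems expecting structured data.
    """
    try:
        record = json.loads(response)
    except json.JSONDecodeError as exc:
        raise ValueError(f"not valid JSON: {exc}") from exc
    for field, ftype in REQUIRED_FIELDS.items():
        if not isinstance(record.get(field), ftype):
            raise ValueError(f"missing or mistyped field: {field}")
    return record

# A well-formed response parses; free-form prose would raise ValueError.
ok = parse_structured('{"name": "Ada", "year": 1843}')
```

The guard makes failures explicit at the boundary, rather than letting an unstructured response propagate into tables, lists, or downstream APIs.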
Quantum computing has shown great potential to transform specific algorithms and applications and is expected to work alongside traditional High-Performance Computing (HPC) environments. Moreover, Noisy Intermediate-Scale Quantum (NISQ) devices have emerged as powerful computational platforms, but they face challenges such as limited qubit coherence times and high error rates. Due to the complexity…
Computational social science (CSS) leverages advanced computational techniques to analyze and interpret vast amounts of social data. This field increasingly relies on natural language processing (NLP) methods to handle unstructured text data. However, while large language models (LLMs) have revolutionized CSS by enabling rapid and sophisticated text analysis, their integration into practical applications remains a…