
Posts

Showing posts with the label deep learning

Chain-of-Thought Prompting Elicits Reasoning in Large Language Models - Leitura de Artigo

https://arxiv.org/pdf/2201.11903.pdf

Chain-of-Thought (CoT) is used to pose questions to LLMs, but the questions may be incomplete with respect to context. An exploratory search is made up of several questions formulated over the course of the process. It relates to fine-tuning on the target task, because it teaches the LLM how to answer.

Introduction: However, scaling up model size alone has not proved sufficient for achieving high performance on challenging tasks such as arithmetic, commonsense, and symbolic reasoning (Rae et al., 2021). This work explores how the reasoning ability of large language models can be unlocked by a simple method motivated by two ideas. First, techniques for arithmetic reasoning can benefit from generating natural language rationales that lead to the final answer. Prior work has given models the ability to generate natural language intermediate steps by training from scratch (Ling et al., 2017) or finetuning a pretrained model (Cobbe e...
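The prompting idea above can be sketched as follows: a few-shot exemplar that shows intermediate reasoning steps before the final answer, followed by the new question. The exemplar text and function name here are illustrative, not lifted verbatim from the paper.

```python
def build_cot_prompt(question: str) -> str:
    """Build a one-shot chain-of-thought prompt for an LLM."""
    # The exemplar demonstrates reasoning step by step before answering,
    # which is what elicits similar step-by-step reasoning from the model.
    exemplar = (
        "Q: Roger has 5 tennis balls. He buys 2 cans of 3 tennis balls each. "
        "How many tennis balls does he have now?\n"
        "A: Roger started with 5 balls. 2 cans of 3 balls is 6 balls. "
        "5 + 6 = 11. The answer is 11.\n\n"
    )
    return exemplar + f"Q: {question}\nA:"

prompt = build_cot_prompt(
    "A pencil costs 2 and a pen costs 3. What do 4 pencils and 2 pens cost?"
)
```

The returned string ends at "A:", so the model's completion is the rationale plus the final answer; a plain few-shot prompt would instead show only question/answer pairs with no intermediate steps.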

Knowledge graphs as tools for explainable machine learning: A survey

Link https://doi.org/10.1016/j.artint.2021.103627 Abstract This paper provides an extensive overview of the use of knowledge graphs in the context of Explainable Machine Learning. As of late, explainable AI has become a very active field of research by addressing the limitations of the latest machine learning solutions that often provide highly accurate, but hardly scrutable and interpretable decisions. An increasing interest has also been shown in the integration of Knowledge Representation techniques in Machine Learning applications, mostly motivated by the complementary strengths and weaknesses that could lead to a new generation of hybrid intelligent systems. Following this idea, we hypothesise that knowledge graphs, which naturally provide domain background knowledge in a machine-readable format, could be integrated in Explainable Machine Learning approaches to help them provide more meaningful, insightful and trustworthy explanations. 6. Current challenges (and ideas to...

Context-Aware Temporal Knowledge Graph Embedding - Leitura de Artigo

Yu Liu, Wen Hua, Kexuan Xin, and Xiaofang Zhou. 2020. Context-Aware Temporal Knowledge Graph Embedding. In Web Information Systems Engineering – WISE 2019: 20th International Conference, Hong Kong, China, January 19–22, 2020, Proceedings. Springer-Verlag, Berlin, Heidelberg, 583–598. https://doi.org/10.1007/978-3-030-34223-4_37 Abstract Knowledge graph embedding (KGE) is an important technique used for knowledge graph completion (KGC). However, knowledge in practice is time-variant and many relations are only valid for a certain period of time. This phenomenon highlights the importance of temporal knowledge graph embeddings. [Consider the temporal context when judging the validity of relationships] Currently, existing temporal KGE methods only focus on one aspect of facts, i.e., the factual plausibility, while ignoring the other aspect, i.e., the temporal consistency. Temporal consistency models the interactions between a fact and its contexts, and thus is able to capture fine-granularity te...
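To make "factual plausibility over time" concrete, here is a minimal toy sketch of a temporal KGE score: a TransE-style translation score extended with an additive time embedding, so the same triple can score differently at different timestamps. This is an illustrative baseline, not the model proposed in the paper, and all names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy embedding tables for entities, relations, and discrete time steps.
ent = {name: rng.normal(size=dim) for name in ["BarackObama", "USA"]}
rel = {name: rng.normal(size=dim) for name in ["presidentOf"]}
time = {t: rng.normal(size=dim) for t in [2009, 2017]}

def score(h: str, r: str, t: str, tau: int) -> float:
    # TransE-style plausibility with time: a quadruple (h, r, t, tau)
    # is plausible when h + r + tau lands close to t in embedding space.
    # Higher (less negative) scores mean more plausible.
    return float(-np.linalg.norm(ent[h] + rel[r] + time[tau] - ent[t]))

s = score("BarackObama", "presidentOf", "USA", 2009)
```

Because the time embedding enters the translation, training can push valid (fact, timestamp) pairs toward high scores while the same fact at an invalid timestamp scores low; the paper's point is that this per-fact plausibility alone ignores consistency between a fact and its temporal contexts.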

CoKE: Contextualized Knowledge Graph Embedding

Quan Wang, Pingping Huang, Haifeng Wang, Songtai Dai, Wenbin Jiang, Jing Liu, Yajuan Lyu, Yong Zhu, Hua Wu: CoKE: Contextualized Knowledge Graph Embedding. CoRR abs/1911.02168 (2019) Abstract Knowledge graph embedding, which projects symbolic entities and relations into continuous vector spaces, is gaining increasing attention. Previous methods allow a single static embedding for each entity or relation, ignoring their intrinsic contextual nature, i.e., entities and relations may appear in different graph contexts, and accordingly, exhibit different properties. [The context of an entity or relation depends on the graph where it appears, and this affects the generated embedding] This work presents Contextualized Knowledge Graph Embedding (CoKE), a novel paradigm that takes into account such contextual nature, and learns dynamic, flexible, and fully contextualized entity and relation embeddings. Two types of graph contexts are studied: edges and paths, both formulated as sequences of en...
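The two context types the abstract mentions, edges and paths, can both be flattened into token sequences, which is what lets a sequence encoder produce context-dependent embeddings. The sketch below only shows the sequence formulation (the masked position is what an encoder would be trained to predict); entity names and the "[MASK]" token are illustrative assumptions, and the actual encoder is omitted.

```python
def edge_sequence(h: str, r: str, t: str) -> list[str]:
    # An edge (h, r, t) is formulated as the length-3 sequence [h, r, t].
    return [h, r, t]

def path_sequence(head: str, hops: list[tuple[str, str]]) -> list[str]:
    # A path head -r1-> e1 -r2-> e2 ... is flattened into an alternating
    # sequence of entities and relations: [head, r1, e1, r2, e2, ...].
    seq = [head]
    for r, e in hops:
        seq += [r, e]
    return seq

# Link prediction cast as masked-token prediction over an edge sequence:
query = edge_sequence("BarackObama", "presidentOf", "[MASK]")

# A multi-hop path context for the same head entity:
path = path_sequence("BarackObama", [("bornIn", "Hawaii"), ("locatedIn", "USA")])
```

Feeding such sequences to a sequence encoder yields a different vector for "BarackObama" in each sequence, in contrast to the single static vector per entity used by earlier KGE methods.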