id | text | source |
|---|---|---|
b9e9af00ccda-7 | been traveling for over 13 billion years to reach us. • JWST has provided us with the first images of exoplanets, which are planets outside of our own solar system. These distant worlds were first discovered in 1992, and the JWST has allowed us to see them in greater detail. These discoveries can spark a child'... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-8 | > Finished chain. > Entering new LLMChain chain... Prompt after formatting: You are an expert fact checker. You have been hired by a major news organization to fact check a very important story. Here is a bullet point list of facts: """ • The James Webb Space Telescope (JWST) spotted... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-9 | spotted a number of galaxies nicknamed "green peas." - True • The light from these galaxies has been traveling for over 13 billion years to reach us. - True • JWST has provided us with the first images of exoplanets, which are planets outside of our own solar system. - False. The first exoplanet was... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-10 | greater detail. These discoveries can spark a child's imagination about the infinite wonders of the universe. """ Using these checked assertions, rewrite the original summary to be completely true. The output should have the same structure and formatting as the original summary. Summary: ... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-11 | • The James Webb Space Telescope (JWST) spotted a number of galaxies nicknamed "green peas." - True • The light from these galaxies has been traveling for over 13 billion years to reach us. - True • JWST has provided us with the first images of exoplanets, which are planets outside of our own sola... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-12 | in 1992. The JWST will allow us to see them in greater detail when it is launched in 2023. These discoveries can spark a child's imagination about the infinite wonders of the universe. > Finished chain. 'Your 9-year old might like these recent discoveries made by The James Webb Space Telescope (JWST):\n•... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-13 | The sea is named after the island of Greenland, and is the Arctic Ocean's main outlet to the Atlantic. It is often frozen over so navigation is limited, and is considered the northern branch of the Norwegian Sea."checker_chain.run(text) > Entering new LLMSummarizationCheckerChain chain... > Enteri... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-14 | important story. Here is a bullet point list of facts: """ - The Greenland Sea is an outlying portion of the Arctic Ocean located between Iceland, Norway, the Svalbard archipelago and Greenland. - It has an area of 465,000 square miles. - It is one of five oceans in the world, alongside the Pacif... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-15 | Norway, the Svalbard archipelago and Greenland. True - It has an area of 465,000 square miles. True - It is one of five oceans in the world, alongside the Pacific Ocean, Atlantic Ocean, Indian Ocean, and the Southern Ocean. False - The Greenland Sea is not an ocean, it is an arm of the Arctic Ocean. ... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-16 | is limited, and is considered the northern branch of the Norwegian Sea. """ Using these checked assertions, rewrite the original summary to be completely true. The output should have the same structure and formatting as the original summary. Summary: > Finished chain. > Enterin... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-17 | portion of the Arctic Ocean located between Iceland, Norway, the Svalbard archipelago and Greenland. True - It has an area of 465,000 square miles. True - It is one of five oceans in the world, alongside the Pacific Ocean, Atlantic Ocean, Indian Ocean, and the Southern Ocean. False - The Greenland Sea is ... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-18 | navigation is limited, and is considered the northern branch of the Norwegian Sea. > Entering new SequentialChain chain... > Entering new LLMChain chain... Prompt after formatting: Given some text, extract a list of facts from the text. Format your output as a bulleted list. Te... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-19 | - It is an arm of the Arctic Ocean. - It is covered almost entirely by water, some of which is frozen in the form of glaciers and icebergs. - It is named after the island of Greenland. - It is the Arctic Ocean's main outlet to the Atlantic. - It is often frozen over so navigation is limited. - It is cons... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-20 | of Greenland. - It is the Arctic Ocean's main outlet to the Atlantic. True - It is often frozen over so navigation is limited. True - It is considered the northern branch of the Norwegian Sea. False - It is considered the northern branch of the Atlantic Ocean. """ Original Summary: """... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-21 | Checked Assertions: """ - The sky is red: False - Water is made of lava: False - The sun is a star: True """ Result: False === Checked Assertions: """ - The sky is blue: True - Water is wet: True - The sun is a star: True """ Result: True === Checked Assertions:... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-22 | False - It is considered the northern branch of the Atlantic Ocean. """ Result: > Finished chain. > Finished chain. The Greenland Sea is an outlying portion of the Arctic Ocean located between Iceland, Norway, the Svalbard archipelago and Greenland. It has an area of 465,000 square miles ... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-23 | of the Atlantic Ocean. """ Facts: > Finished chain. > Entering new LLMChain chain... Prompt after formatting: You are an expert fact checker. You have been hired by a major news organization to fact check a very important story. Here is a bullet point list of facts: """ ... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-24 | Checked Assertions: """ - The Greenland Sea is an outlying portion of the Arctic Ocean located between Iceland, Norway, the Svalbard archipelago and Greenland. True - It has an area of 465,000 square miles. True - It is covered almost entirely by water, some of which is frozen in the form of gla... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-25 | be completely true. The output should have the same structure and formatting as the original summary. Summary: > Finished chain. > Entering new LLMChain chain... Prompt after formatting: Below are some assertions that have been fact checked and are labeled as true or false. ... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-26 | of 465,000 square miles. True - It is covered almost entirely by water, some of which is frozen in the form of glaciers and icebergs. True - The sea is named after the country of Greenland. True - It is the Arctic Ocean's main outlet to the Atlantic. False - The Arctic Ocean's main outlet to the At... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-27 | The sea is named after the country of Greenland, and is the Arctic Ocean's main outlet to the Barents Sea. It is often frozen over so navigation is limited, and is considered part of the Arctic Ocean."from langchain.chains import LLMSummarizationCheckerChainfrom langchain.llms import OpenAIllm = OpenAI(temperature=0)ch... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-28 | - Birds are mammals """ For each fact, determine whether it is true or false about the subject. If you are unable to determine whether the fact is true or false, output "Undetermined". If the fact is false, explain why. > Finished chain. > Entering new LLMChain chain... Prompt ... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-29 | assertions that have been fact checked and are labeled as true or false. If all of the assertions are true, return "True". If any of the assertions are false, return "False". Here are some examples: === Checked Assertions: """ - The sky is red: False - Water is made of lava: False - The... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-30 | eggs, however birds are not mammals, they are a class of their own. > Entering new SequentialChain chain... > Entering new LLMChain chain... Prompt after formatting: Given some text, extract a list of facts from the text. Format your output as a bulleted list. Text: """ ... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-31 | some assertions that have been fact checked and are labeled as true or false. If the answer is false, a suggestion is given for a correction. Checked Assertions: """ - Birds and mammals are both capable of laying eggs: False. Mammals give birth to live young, while birds lay eggs. - Birds are n... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
b9e9af00ccda-32 | a star: True """ Result: False === Checked Assertions: """ - The sky is blue: True - Water is wet: True - The sun is a star: True """ Result: True === Checked Assertions: """ - The sky is blue - True - Water is made of lava - False - The sun is a star - True "... | https://python.langchain.com/docs/modules/chains/additional/llm_summarization_checker |
e7ea722c5c40-0 | Graph DB QA chain | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/graph_cypher_qa |
e7ea722c5c40-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/graph_cypher_qa |
e7ea722c5c40-2 | You can run a local docker container by executing the following script:docker run \ --name neo4j \ -p 7474:7474 -p 7687:7687 \ -d \ -e NEO4J_AUTH=neo4j/pleaseletmein \ -e NEO4J_PLUGINS=\[\"apoc\"\] \ neo4j:latestIf you are using the docker container, you need to wait a couple of seconds fo... | https://python.langchain.com/docs/modules/chains/additional/graph_cypher_qa |
e7ea722c5c40-3 | [{'property': 'name', 'type': 'STRING'}], 'labels': 'Movie'}, {'properties': [{'property': 'name', 'type': 'STRING'}], 'labels': 'Actor'}] Relationship properties are the following: [] The relationships are the following: ['(:Actor)-[:ACTED_IN]->(:Movie)'] Querying... | https://python.langchain.com/docs/modules/chains/additional/graph_cypher_qa |
e7ea722c5c40-4 | The default is 10.chain = GraphCypherQAChain.from_llm( ChatOpenAI(temperature=0), graph=graph, verbose=True, top_k=2)chain.run("Who played in Top Gun?") > Entering new GraphCypherQAChain chain... Generated Cypher: MATCH (a:Actor)-[:ACTED_IN]->(m:Movie {name: 'Top Gun'}) RETURN a.name Full Cont... | https://python.langchain.com/docs/modules/chains/additional/graph_cypher_qa |
e7ea722c5c40-5 | [{'query': "MATCH (a:Actor)-[:ACTED_IN]->(m:Movie {name: 'Top Gun'})\nRETURN a.name"}, {'context': [{'a.name': 'Val Kilmer'}, {'a.name': 'Anthony Edwards'}, {'a.name': 'Meg Ryan'}, {'a.name': 'Tom Cruise'}]}] Final answer: Val Kilmer, Anthony Edwards, Meg Ryan, and Tom Cruise played in Top Gun.Return direct results... | https://python.langchain.com/docs/modules/chains/additional/graph_cypher_qa |
ca061add0d60-0 | Document QA | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-2 | import VectorstoreIndexCreatorwith open("../../state_of_the_union.txt") as f: state_of_the_union = f.read()text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)texts = text_splitter.split_text(state_of_the_union)embeddings = OpenAIEmbeddings()docsearch = Chroma.from_texts(texts, embeddings, metadat... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-3 | serve the country and thanked him for his service.'}Custom PromptsYou can also use your own prompts with this chain. In this example, we will respond in Italian.prompt_template = """Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't t... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-4 | {'intermediate_steps': [' "Tonight, I’d like to honor someone who has dedicated his life to serve this country: Justice Stephen Breyer—an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court. Justice Breyer, thank you for your service."', ' A former top litigator in pri... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-5 | template=combine_prompt_template, input_variables=["summaries", "question"])chain = load_qa_chain(OpenAI(temperature=0), chain_type="map_reduce", return_map_steps=True, question_prompt=QUESTION_PROMPT, combine_prompt=COMBINE_PROMPT)chain({"input_documents": docs, "question": query}, return_only_outputs=True) {'inter... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-6 | = "What did the president say about Justice Breyer"chain({"input_documents": docs, "question": query}, return_only_outputs=True) {'output_text': '\n\nThe president said that he wanted to honor Justice Breyer for his dedication to serving the country, his legacy of excellence, and his commitment to advancing liberty ... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-7 | for his dedication to serving the country, his legacy of excellence, and his commitment to advancing liberty and justice, as well as for his support of the Equality Act and his commitment to protecting the rights of LGBTQ+ Americans. He also praised Justice Breyer for his role in helping to pass the Bipartisan Infrastr... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-8 | "Context information is below. \n" "---------------------\n" "{context_str}" "\n---------------------\n" "Given the context information and not prior knowledge, " "answer the question: {question}\nYour answer should be in Italian.\n")initial_qa_prompt = PromptTemplate( input_variables=["context_str", ... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-9 | di questo paese, ha reso omaggio al suo servizio e ha sostenuto la nomina di una top litigatrice in pratica privata, un ex difensore pubblico federale e una famiglia di insegnanti e agenti di polizia delle scuole pubbliche. Ha anche sottolineato l'importanza di avanzare la libertà e la giustizia attraverso la sicurezz... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-10 | "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha reso omaggio al suo servizio e ha sostenuto la nomina di una top litigatrice in pratica privata, un ex difensore pubblico federale e una famiglia di insegnanti e agenti di polizia delle scuole pubbliche. Ha anche sott... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-11 | {'answer': ' This document does not answer the question', 'score': '0'}, {'answer': ' This document does not answer the question', 'score': '0'}, {'answer': ' This document does not answer the question', 'score': '0'}]Custom PromptsYou can also use your own prompts with this chain. In this example, we will resp... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
ca061add0d60-12 | 'score': '100'}, {'answer': ' Il presidente non ha detto nulla sulla Giustizia Breyer.', 'score': '100'}, {'answer': ' Non so.', 'score': '0'}, {'answer': ' Non so.', 'score': '0'}], 'output_text': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese.'}Docu... | https://python.langchain.com/docs/modules/chains/additional/question_answering |
581464dcab1c-0 | Bash chain | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/llm_bash |
581464dcab1c-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/llm_bash |
581464dcab1c-2 | echo "Hello World" ``` Code: ['echo "Hello World"'] Answer: Hello World > Finished chain. 'Hello World\n'Customize Prompt​You can also customize the prompt that is used. Here is an example prompting to avoid using the 'echo' utilityfrom langchain.prompts.prompt import PromptTemplatefrom langchain.c... | https://python.langchain.com/docs/modules/chains/additional/llm_bash |
581464dcab1c-3 | Question: {question}"""PROMPT = PromptTemplate( input_variables=["question"], template=_PROMPT_TEMPLATE, output_parser=BashOutputParser(), | https://python.langchain.com/docs/modules/chains/additional/llm_bash |
581464dcab1c-4 | )```pythonbash_chain = LLMBashChain.from_llm(llm, prompt=PROMPT, verbose=True)text = "Please write a bash script that prints 'Hello World' to the console."bash_chain.run(text) > Entering new LLMBashChain chain... Please write a bash script that prints 'Hello World' to the console. ```bash printf... | https://python.langchain.com/docs/modules/chains/additional/llm_bash |
581464dcab1c-5 | openapi.html llm_math.html pal.html llm_requests.html sqlite.html > Finished chain. 'api.html\t\t\tllm_summarization_checker.html\r\nconstitutional_chain.html\tmoderation.html\r\nllm_bash.html\t\t\topenai_openapi.yaml\r\nllm_checker.html\t\topenapi.html\r\nllm_math.html\t\t\tpal.html\r\nllm_... | https://python.langchain.com/docs/modules/chains/additional/llm_bash |
176414394179-0 | Hypothetical Document Embeddings | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/hyde |
176414394179-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/hyde |
176414394179-2 | own.from langchain.llms import OpenAIfrom langchain.embeddings import OpenAIEmbeddingsfrom langchain.chains import LLMChain, HypotheticalDocumentEmbedderfrom langchain.prompts import PromptTemplatebase_embeddings = OpenAIEmbeddings()llm = OpenAI()# Load with `web_search` promptembeddings = HypotheticalDocumentEmbedder.... | https://python.langchain.com/docs/modules/chains/additional/hyde |
176414394179-3 | llm_chain=llm_chain, base_embeddings=base_embeddings)result = embeddings.embed_query( "What did the president say about Ketanji Brown Jackson")Using HyDE​Now that we have HyDE, we can use it as we would any other embedding class! Here is using it to find similar passages in the state of the union example.from lang... | https://python.langchain.com/docs/modules/chains/additional/hyde |
176414394179-4 | you for your service. One of the most serious constitutional responsibilities a President has is nominating someone to serve on the United States Supreme Court. And I did that 4 days ago, when I nominated Circuit Court of Appeals Judge Ketanji Brown Jackson. One of our nation’s top legal minds, who wi... | https://python.langchain.com/docs/modules/chains/additional/hyde |
a54841183ae0-0 | Retrieval QA using OpenAI functions | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
a54841183ae0-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
a54841183ae0-2 | encoding="utf-8")documents = loader.load()text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)texts = text_splitter.split_documents(documents)for i, text in enumerate(texts): text.metadata["source"] = f"{i}-pl"embeddings = OpenAIEmbeddings()docsearch = Chroma.from_documents(texts, embeddings)from ... | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
a54841183ae0-3 | determination to protect American interests.",\n "sources": ["0-pl", "4-pl", "5-pl", "6-pl"]\n}'Using Pydantic​If we want to, we can set the chain to return in Pydantic. Note that if downstream chains consume the output of this chain - including memory - they will generally expect it to be in string format, so you s... | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
a54841183ae0-4 | langchain.memory import ConversationBufferMemoryfrom langchain.chains import LLMChainmemory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)_template = """Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question, in its original la... | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
a54841183ae0-5 | minds who will continue Justice Breyer\'s legacy of excellence.",\n "sources": ["31-pl"]\n}'}query = "what did he say about her predecessor?"result = qa({"question": query})result {'question': 'what did he say about her predecessor?', 'chat_history': [HumanMessage(content='What did the president say about Ketan... | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
a54841183ae0-6 | example we can add a countries_referenced parameter to our schema and describe what we want this parameter to mean, and that'll cause the OpenAI output to include a description of a speaker in the response.In addition to the previous example, we can also add a custom prompt to the chain. This will allow you to add addi... | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
a54841183ae0-7 | Return all of the countries mentioned in the sources in uppercase characters." ),]chain_prompt = ChatPromptTemplate(messages=prompt_messages)qa_chain_pydantic = create_qa_with_structure_chain( llm, CustomResponseSchema, output_parser="pydantic", prompt=chain_prompt)final_qa_chain_pydantic = StuffDocumentsChain( ... | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
a54841183ae0-8 | from the international financial system, and preventing Russia's central bank from defending the Russian Ruble. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs.", countries_referenced=['AMERICA', 'RUSSIA', 'UKRAINE'], sources=['4-pl', '5-pl', '2-pl', '3-pl'])Pr... | https://python.langchain.com/docs/modules/chains/additional/openai_functions_retrieval_qa |
686d651dfe1e-0 | ArangoDB QA chain | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-2 | Databaseimport jsonfrom arango import ArangoClientfrom adb_cloud_connector import get_temp_credentialscon = get_temp_credentials()db = ArangoClient(hosts=con["url"]).db( con["dbName"], con["username"], con["password"], verify=True)print(json.dumps(con, indent=2)) Log: requesting new credentials... Success: ne... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-3 | ["Characters"], }, ],)documents = [ { "_key": "NedStark", "name": "Ned", "surname": "Stark", "alive": True, "age": 41, "gender": "male", }, { "_key": "CatelynStark", "name": "Catelyn", "surname": "Stark", "alive": False, "ag... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-4 | "male", },]edges = [ {"_to": "Characters/NedStark", "_from": "Characters/AryaStark"}, {"_to": "Characters/NedStark", "_from": "Characters/BranStark"}, {"_to": "Characters/CatelynStark", "_from": "Characters/AryaStark"}, {"_to": "Characters/CatelynStark", "_from": "Characters/BranStark"},]db.collection("C... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-5 | "graph_name": "GameOfThrones", "edge_definitions": [ { "edge_collection": "ChildOf", "from_vertex_collections": [ "Characters" ], "to_vertex_collections": [ ... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-6 | "edge_properties": [ { "name": "_key", "type": "str" }, { "name": "_id", "type": "str" }, { "name": "_from", ... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-7 | "type": "str" }, { "name": "_rev", "type": "str" } ], "example_edge": { "_key": "266218884025", "_id": "ChildOf/266218884025", "_f... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-8 | "collection_type": "document", "document_properties": [ { "name": "_key", "type": "str" }, { "name": "_id", "type": "str" }, ... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-9 | "type": "str" }, { "name": "surname", "type": "str" }, { "name": "alive", "type": "bool" }, { ... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-10 | "name": "gender", "type": "str" } ], "example_document": { "_key": "NedStark", "_id": "Characters/NedStark", "_rev": "_gVPKGPi---", "name": "Ned", "surna... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-11 | = "your-key-here"from langchain.chat_models import ChatOpenAIfrom langchain.chains import ArangoGraphQAChainchain = ArangoGraphQAChain.from_llm( ChatOpenAI(temperature=0), graph=graph, verbose=True)chain.run("Is Ned Stark alive?") > Entering new ArangoGraphQAChain chain... AQL Query (1): WITH Charac... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-12 | FILTER p.vertices[-1]._key == 'NedStark' RETURN p AQL Result: [{'vertices': [{'_key': 'AryaStark', '_id': 'Characters/AryaStark', '_rev': '_gVPKGPi--B', 'name': 'Arya', 'surname': 'Stark', 'alive': True, 'age': 11, 'gender': 'female'}, {'_key': 'NedStark', '_id': 'Characters/NedStark', '_rev': '_gVPKGP... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-13 | RETURN e AQL Result: [{'_key': '266218884027', '_id': 'ChildOf/266218884027', '_from': 'Characters/AryaStark', '_to': 'Characters/CatelynStark', '_rev': '_gVPKGSu---'}] > Finished chain. 'Yes, Arya Stark has a dead parent. The parent is Catelyn Stark.'Chain Modifiers​You can alter the values of th... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
686d651dfe1e-14 | ArangoGraphQAChain chain... AQL Query (1): RETURN DOCUMENT('Characters/NedStark').alive AQL Result: [True] > Finished chain. 'Yes, according to the information in the database, Ned Stark is alive.'chain.run("Is Bran Stark the child of Ned Stark?") > Entering new ArangoGraphQAChain c... | https://python.langchain.com/docs/modules/chains/additional/graph_arangodb_qa |
5d0b647175fa-0 | Self-checking chain | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/llm_checker |
5d0b647175fa-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/llm_checker |
5d0b647175fa-2 | Finished chain. ' No mammal lays the biggest eggs. The Elephant Bird, which was a species of giant bird, laid the largest eggs of any bird.'PreviousBash chainNextMath chainCommunityDiscordTwitterGitHubPythonJS/TSMoreHomepageBlogCopyright © 2023 LangChain, Inc. | https://python.langchain.com/docs/modules/chains/additional/llm_checker |
7912408f4eb4-0 | LLM Symbolic Math | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/llm_symbolic_math |
7912408f4eb4-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/llm_symbolic_math |
7912408f4eb4-2 | "What is the integral of exp(x)*sin(x) + exp(x)*cos(x) with respect to x?") 'Answer: exp(x)*sin(x)'Solve linear and differential equations​llm_symbolic_math.run('Solve the differential equation y" - y = e^t') 'Answer: Eq(y(t), C2*exp(-t) + (C1 + t/2)*exp(t))'llm_symbolic_math.run("What are the solutions to this... | https://python.langchain.com/docs/modules/chains/additional/llm_symbolic_math |
c340ec6e138d-0 | KuzuQAChain | 🦜️🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/graph_kuzu_qa |
c340ec6e138d-1 | Skip to main content🦜️🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/graph_kuzu_qa |
c340ec6e138d-2 | STRING, birthDate STRING, PRIMARY KEY(name))")conn.execute("CREATE REL TABLE ActedIn (FROM Person TO Movie)") <kuzu.query_result.QueryResult at 0x1066ff410>Then we can insert some data.conn.execute("CREATE (:Person {name: 'Al Pacino', birthDate: '1940-04-25'})")conn.execute("CREATE (:Person {name: 'Robert De Niro', ... | https://python.langchain.com/docs/modules/chains/additional/graph_kuzu_qa |
c340ec6e138d-3 | <kuzu.query_result.QueryResult at 0x107016210>Creating KuzuQAChain​We can now create the KuzuGraph and KuzuQAChain. To create the KuzuGraph we simply need to pass the database object to the KuzuGraph constructor.from langchain.chat_models import ChatOpenAIfrom langchain.graphs import KuzuGraphfrom langchain.chains im... | https://python.langchain.com/docs/modules/chains/additional/graph_kuzu_qa |
c340ec6e138d-4 | 'Al Pacino and Robert De Niro both played in The Godfather: Part II.'chain.run("Robert De Niro played in which movies?") > Entering new chain... Generated Cypher: MATCH (p:Person {name: 'Robert De Niro'})-[:ActedIn]->(m:Movie) RETURN m.name Full Context: [{'m.name': 'The Godfather: Part II'}]... | https://python.langchain.com/docs/modules/chains/additional/graph_kuzu_qa |
c340ec6e138d-5 | Context: [{'p.name': 'Al Pacino'}] > Finished chain. 'The oldest actor who played in The Godfather: Part II is Al Pacino.'PreviousHugeGraph QA ChainNextNebulaGraphQAChainCreating KuzuQAChainRefresh graph schema informationQuerying the graphCommunityDiscordTwitterGitHubPythonJS/TSMoreHomepageBlogCopyright ©... | https://python.langchain.com/docs/modules/chains/additional/graph_kuzu_qa
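The KuzuQAChain run above follows one flow: question in, generated Cypher, query result rows, natural-language answer out. The "oldest actor" step can be sketched without a database or LLM, using plain dicts for the graph; Robert De Niro's birthDate is truncated in the docs excerpt, so the value below is illustrative:

```python
from datetime import date

# Sketch of the KuzuQAChain "oldest actor" query without Kuzu or an LLM.
# The graph is plain Python data; the function stands in for the generated
# Cypher. De Niro's birthDate is an illustrative value, not from the docs.

people = {
    "Al Pacino": date(1940, 4, 25),          # from the docs example
    "Robert De Niro": date(1943, 8, 17),     # illustrative
}
acted_in = [
    ("Al Pacino", "The Godfather: Part II"),
    ("Robert De Niro", "The Godfather: Part II"),
]

def oldest_actor(movie: str) -> str:
    # Stands in for Cypher like:
    #   MATCH (p:Person)-[:ActedIn]->(m:Movie {name: $movie})
    #   RETURN p.name ORDER BY p.birthDate LIMIT 1
    cast = [p for p, m in acted_in if m == movie]
    return min(cast, key=lambda p: people[p])  # earliest birthDate = oldest

print(f"The oldest actor who played in The Godfather: Part II is "
      f"{oldest_actor('The Godfather: Part II')}.")
```

In the real chain the LLM writes the Cypher from the graph schema and then phrases the returned rows (`[{'p.name': 'Al Pacino'}]`) as the final sentence.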
16eb4cec3950-0 | GraphSparqlQAChain | 🦜🔗 Langchain | https://python.langchain.com/docs/modules/chains/additional/graph_sparql_qa
16eb4cec3950-1 | Skip to main content🦜🔗 LangChainDocsUse casesIntegrationsAPILangSmithJS/TS DocsCTRLKGet startedIntroductionInstallationQuickstartModulesModel I/OData connectionChainsHow toFoundationalDocumentsPopularAdditionalAnalyze DocumentSelf-critique chain with constitutional AICausal program-aided language (CPAL) cha... | https://python.langchain.com/docs/modules/chains/additional/graph_sparql_qa
16eb4cec3950-2 | Disclaimer: To date, SPARQL query generation via LLMs is still a bit unstable. Be especially careful with UPDATE queries, which alter the graph.There are several sources you can run queries against, including files on the web, files you have available locally, SPARQL endpoints, e.g., Wikidata, and triple stores.from la... | https://python.langchain.com/docs/modules/chains/additional/graph_sparql_qa |
16eb4cec3950-3 | (seeAlso, None), <http://purl.org/dc/elements/1.1/title> (title, None), <http://xmlns.com/foaf/0.1/mbox_sha1sum> (mbox_sha1sum, None), <http://xmlns.com/foaf/0.1/maker> (maker, None), <http://www.w3.org/ns/solid/terms#oidcIssuer> (oidcIssuer, None), <http://www.w3.org/2000/10/swap/pim/contact#publicHomePage> (publicHom... | https://python.langchain.com/docs/modules/chains/additional/graph_sparql_qa |
16eb4cec3950-4 | (locality, None), <http://xmlns.com/foaf/0.1/nick> (nick, None), <http://xmlns.com/foaf/0.1/homepage> (homepage, None), <http://creativecommons.org/ns#license> (license, None), <http://xmlns.com/foaf/0.1/givenname> (givenname, None), <http://www.w3.org/2006/vcard/ns#street-address> (street-address, None), <http://www.w... | https://python.langchain.com/docs/modules/chains/additional/graph_sparql_qa |
16eb4cec3950-5 | None), <http://xmlns.com/foaf/0.1/family_name> (family_name, None), <http://xmlns.com/foaf/0.1/account> (account, None), <http://xmlns.com/foaf/0.1/workplaceHomepage> (workplaceHomepage, None), <http://purl.org/dc/terms/title> (title, None), <http://www.w3.org/ns/solid/terms#publicTypeIndex> (publicTypeIndex, None), <h... | https://python.langchain.com/docs/modules/chains/additional/graph_sparql_qa |
16eb4cec3950-6 | (inbox, None), <http://www.w3.org/ns/solid/terms#editableProfile> (editableProfile, None), <http://www.w3.org/2000/10/swap/pim/contact#postalCode> (postalCode, None), <http://xmlns.com/foaf/0.1/weblog> (weblog, None), <http://www.w3.org/ns/auth/cert#exponent> (exponent, None), <http://rdfs.org/sioc/ns#avatar> (avatar, ... | https://python.langchain.com/docs/modules/chains/additional/graph_sparql_qa |
16eb4cec3950-7 | "Save that the person with the name 'Timothy Berners-Lee' has a work homepage at 'http://www.w3.org/foo/bar/'") > Entering new GraphSparqlQAChain chain... Identified intent: UPDATE Generated SPARQL: PREFIX foaf: <http://xmlns.com/foaf/0.1/> INSERT { ?person foaf:workplaceHomepage <http:... | https://python.langchain.com/docs/modules/chains/additional/graph_sparql_qa |
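Before generating SPARQL, the trace above shows the chain first identifying an intent (here `UPDATE`), which matters because the disclaimer warns that UPDATE queries alter the graph. The real chain asks the LLM to classify the request; the keyword heuristic below is only an illustration of the routing idea, not LangChain's implementation:

```python
# Illustrative sketch of the intent-routing step in GraphSparqlQAChain:
# decide whether a request reads the graph (SELECT) or alters it (UPDATE).
# The real chain uses an LLM prompt for this; the heuristic is ours.

def classify_intent(question: str) -> str:
    update_verbs = ("save", "add", "insert", "delete", "remove", "set")
    first_word = question.strip().lower().split()[0]
    return "UPDATE" if first_word in update_verbs else "SELECT"

print(classify_intent(
    "Save that the person with the name 'Timothy Berners-Lee' "
    "has a work homepage at 'http://www.w3.org/foo/bar/'"))  # UPDATE
print(classify_intent("What is Tim Berners-Lee's work homepage?"))  # SELECT
```

Routing on intent lets an application gate the dangerous branch, for example by requiring confirmation before any generated `INSERT` or `DELETE` is executed against the store.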