> Entering new APIResponderChain chain... Prompt after formatting: You are a helpful AI assistant trained to answer user queries from API responses. You attempted to call an API, which resulted in:
/content/https://python.langchain.com/en/latest/modules/chains/examples/openapi.html
API_RESPONSE: {"explanation":"<what-to-say language=\"Hindi\" context=\"None\">\nऔर चाय लाओ। (Aur chai lao.) \n</what-to-say>\n\n<alternatives context=\"None\">\n1. \"चाय थोड़ी ज्यादा मिल सकती है?\" *(Chai thodi zyada mil sakti hai? - Polite, asking if more tea is available)*\n2. \"मुझे महसूस हो रहा है कि मुझे कुछ अन्य...
main aur cups chai lekar aaun? - Sir, should I get more tea cups?)\nRahul: हां,बिल्कुल। और चाय की मात्रा में भी थोड़ा सा इजाफा करना। (Haan,bilkul. Aur chai ki matra mein bhi thoda sa eejafa karna. - Yes, please. And add a little extra in the quantity of tea as well.)\n</example-convo>\n\n*[Report an issue or leave feed...
USER_COMMENT: "How would ask for more tea in Delhi?" If the API_RESPONSE can answer the USER_COMMENT respond with the following markdown json block: Response: ```json {"response": "Concise response to USER_COMMENT based on API_RESPONSE."} ``` Otherwise respond with the following markdown json block: Response Error: ```...
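The responder prompt above asks the model for a markdown JSON block. A minimal, framework-free sketch of pulling the payload out of such a reply; the `reply` string here is an illustrative stand-in, not real chain output:

```python
import json
import re

# Illustrative model reply in the shape the responder prompt requests.
reply = 'Response: ```json\n{"response": "Aur chai lao. (Bring more tea.)"}\n```'

# Pull the JSON object out of the fenced block and parse it.
match = re.search(r"```json\s*(\{.*?\})\s*```", reply, re.DOTALL)
payload = json.loads(match.group(1))
print(payload["response"])  # Aur chai lao. (Bring more tea.)
```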
'{"explanation":"<what-to-say language=\\"Hindi\\" context=\\"None\\">\\nऔर चाय लाओ। (Aur chai lao.) \\n</what-to-say>\\n\\n<alternatives context=\\"None\\">\\n1. \\"चाय थोड़ी ज्यादा मिल सकती है?\\" *(Chai thodi zyada mil sakti hai? - Polite, asking if more tea is available)*\\n2. \\"मुझे महसूस हो रहा है कि मुझे कुछ अन...
हां,बिल्कुल। और चाय की मात्रा में भी थोड़ा सा इजाफा करना। (Haan,bilkul. Aur chai ki matra mein bhi thoda sa eejafa karna. - Yes, please. And add a little extra in the quantity of tea as well.)\\n</example-convo>\\n\\n*[Report an issue or leave feedback](https://speak.com/chatgpt?rid=d4mcapbkopo164pqpbk321oc})*","extra_...
previous Moderation next PAL Contents Load the spec Select the Operation Construct the chain Return raw response Example POST message By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Apr 26, 2023.
.ipynb .pdf PAL Contents Math Prompt Colored Objects Intermediate Steps PAL# Implements Program-Aided Language Models, as in https://arxiv.org/pdf/2211.10435.pdf. from langchain.chains import PALChain from langchain import OpenAI llm = OpenAI(model_name='code-davinci-002', temperature=0, max_tokens=512) Math Prompt# ...
/content/https://python.langchain.com/en/latest/modules/chains/examples/pal.html
question = "On the desk, you see two blue booklets, two purple booklets, and two yellow pairs of sunglasses. If I remove all the pairs of sunglasses from the desk, how many purple items remain on it?" pal_chain.run(question) > Entering new PALChain chain... # Put objects into a list to record ordering objects = [] obje...
# Put objects into a list to record ordering objects = [] objects += [('booklet', 'blue')] * 2 objects += [('booklet', 'purple')] * 2 objects += [('sunglasses', 'yellow')] * 2 # Remove all pairs of sunglasses objects = [object for object in objects if object[0] != 'sunglasses'] # Count number of purple objects num_purp...
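The generated program is cut off above; a completed sketch of the kind of Python PAL emits for this question (the tail past the truncation is a reconstruction, not the notebook's verbatim output):

```python
# Put objects into a list to record ordering
objects = []
objects += [('booklet', 'blue')] * 2
objects += [('booklet', 'purple')] * 2
objects += [('sunglasses', 'yellow')] * 2

# Remove all pairs of sunglasses
objects = [obj for obj in objects if obj[0] != 'sunglasses']

# Count the number of purple objects that remain
num_purple = len([obj for obj in objects if obj[1] == 'purple'])
answer = num_purple
print(answer)  # 2
```

The point of PAL is that the arithmetic happens in the Python runtime, not in the model: the chain executes the generated program and returns `answer`.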
.ipynb .pdf Graph QA Contents Create the graph Querying the graph Save the graph Graph QA# This notebook goes over how to do question answering over a graph data structure. Create the graph# In this section, we construct an example graph. At the moment, this works best for small pieces of text. from langchain.indexes...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/graph_qa.html
('Intel', 'state-of-the-art factories', 'is building'), ('Intel', '10,000 new good-paying jobs', 'is creating'), ('Intel', 'Silicon Valley', 'is helping build'), ('Field of dreams', "America's future will be built", 'is the ground on which')] Querying the graph# We can now use the graph QA chain to ask question ...
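The chain answers questions by looking up facts in these triples. A framework-free sketch of that lookup, using the (subject, object, relation) shape shown above; the helper name is illustrative, not LangChain's:

```python
# Triples in the (subject, object, relation) shape produced above.
triples = [
    ('Intel', 'state-of-the-art factories', 'is building'),
    ('Intel', '10,000 new good-paying jobs', 'is creating'),
    ('Intel', 'Silicon Valley', 'is helping build'),
]

def facts_about(entity):
    """Render every triple whose subject matches `entity` as a sentence."""
    return [f"{s} {r} {o}" for (s, o, r) in triples if s == entity]

# The graph QA chain feeds facts like these to the LLM as context.
print(facts_about('Intel')[0])  # Intel is building state-of-the-art factories
```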
.ipynb .pdf Question Answering with Sources Contents Prepare Data Quickstart The stuff Chain The map_reduce Chain The refine Chain The map-rerank Chain Question Answering with Sources# This notebook walks through how to use LangChain for question answering with sources over a list of documents. It covers four differe...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html
embeddings = OpenAIEmbeddings() docsearch = Chroma.from_texts(texts, embeddings, metadatas=[{"source": str(i)} for i in range(len(texts))]) Running Chroma using direct local API. Using DuckDB in-memory for database. Data will be transient. query = "What did the president say about Justice Breyer" docs = docsearch.simil...
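`docsearch.similarity_search` ranks stored chunks by embedding similarity to the query. A toy stand-in for that lookup, with word counts in place of OpenAIEmbeddings and a plain `max` in place of Chroma; everything here is illustrative:

```python
import math

# "Embed" a text as a bag-of-words count vector.
def embed(text):
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

# Cosine similarity between two sparse count vectors.
def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0) for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

texts = [
    "the president thanked justice breyer",
    "intel is building factories",
]
query = "what did the president say about justice breyer"
best = max(texts, key=lambda t: cosine(embed(query), embed(t)))
print(best)  # the president thanked justice breyer
```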
{'output_text': ' The president thanked Justice Breyer for his service.\nSOURCES: 30-pl'} Custom Prompts You can also use your own prompts with this chain. In this example, we will respond in Italian. template = """Given the following extracted parts of a long document and a question, create a final answer with referen...
query = "What did the president say about Justice Breyer" chain({"input_documents": docs, "question": query}, return_only_outputs=True) {'output_text': ' The president thanked Justice Breyer for his service.\nSOURCES: 30-pl'} Intermediate Steps We can also return the intermediate steps for map_reduce chains, should we ...
Return any relevant text in Italian. {context} Question: {question} Relevant text, if any, in Italian:""" QUESTION_PROMPT = PromptTemplate( template=question_prompt_template, input_variables=["context", "question"] ) combine_prompt_template = """Given the following extracted parts of a long document and a question,...
{'intermediate_steps': ["\nStasera vorrei onorare qualcuno che ha dedicato la sua vita a servire questo paese: il giustizia Stephen Breyer - un veterano dell'esercito, uno studioso costituzionale e un giustizia in uscita della Corte Suprema degli Stati Uniti. Giustizia Breyer, grazie per il tuo servizio.", ' Non pert...
{'output_text': "\n\nThe president said that he was honoring Justice Breyer for his dedication to serving the country and that he was a retiring Justice of the United States Supreme Court. He also thanked him for his service and praised his career as a top litigator in private practice, a former federal public defender...
chain = load_qa_with_sources_chain(OpenAI(temperature=0), chain_type="refine", return_intermediate_steps=True) chain({"input_documents": docs, "question": query}, return_only_outputs=True) {'intermediate_steps': ['\nThe president said that he was honoring Justice Breyer for his dedication to serving the country and tha...
'\n\nThe president said that he was honoring Justice Breyer for his dedication to serving the country and that he was a retiring Justice of the United States Supreme Court. He also thanked Justice Breyer for his service, noting his background as a top litigator in private practice, a former federal public defender, and...
'output_text': '\n\nThe president said that he was honoring Justice Breyer for his dedication to serving the country and that he was a retiring Justice of the United States Supreme Court. He also thanked Justice Breyer for his service, noting his background as a top litigator in private practice, a former federal publi...
"------------\n" "{context_str}\n" "------------\n" "Given the new context, refine the original answer to better " "answer the question (in Italian)" "If you do update it, please update the sources as well. " "If the context isn't useful, return the original answer." ) refine_prompt = PromptTemp...
"\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sottolineato l'impor...
chain = load_qa_with_sources_chain(OpenAI(temperature=0), chain_type="map_rerank", metadata_keys=['source'], return_intermediate_steps=True) query = "What did the president say about Justice Breyer" result = chain({"input_documents": docs, "question": query}, return_only_outputs=True) result["output_text"] ' The Presid...
output_keys=["answer", "score"], ) prompt_template = """Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer. In addition to giving an answer, also return a score of how fully it answered the user's question. Th...
'score': '100'}, {'answer': ' Non so.', 'score': '0'}, {'answer': ' Il presidente non ha detto nulla sulla giustizia Breyer.', 'score': '100'}], 'output_text': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese e ha onorato la sua carriera.'}
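The map_rerank chain asks the model to score each per-document answer, then keeps the best one. The selection step amounts to a max over the scores; a sketch shaped like the intermediate steps above:

```python
# Per-document (answer, score) pairs, shaped like the intermediate_steps above.
steps = [
    {'answer': ' Non so.', 'score': '0'},
    {'answer': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese.', 'score': '100'},
    {'answer': ' This document does not answer the question', 'score': '0'},
]

# Keep the answer the model scored highest.
best = max(steps, key=lambda s: int(s['score']))
print(best['answer'])
```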
.ipynb .pdf Question Answering Contents Prepare Data Quickstart The stuff Chain The map_reduce Chain The refine Chain The map-rerank Chain Question Answering# This notebook walks through how to use LangChain for question answering over a list of documents. It covers four different types of chains: stuff, map_reduce, ...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html
Running Chroma using direct local API. Using DuckDB in-memory for database. Data will be transient. query = "What did the president say about Justice Breyer" docs = docsearch.get_relevant_documents(query) from langchain.chains.question_answering import load_qa_chain from langchain.llms import OpenAI Quickstart# If you ...
prompt_template = """Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer. {context} Question: {question} Answer in Italian:""" PROMPT = PromptTemplate( template=prompt_template, input_variables=["context", ...
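The stuff chain simply formats every retrieved document into that one prompt. A sketch of the formatting step, using Python's `str.format` as a stand-in for PromptTemplate; the sample documents are illustrative:

```python
prompt_template = (
    "Use the following pieces of context to answer the question at the end. "
    "If you don't know the answer, just say that you don't know, "
    "don't try to make up an answer.\n"
    "{context}\n"
    "Question: {question}\n"
    "Answer in Italian:"
)

# The stuff chain joins all documents into a single context string.
docs = ["Tonight, I'd like to honor Justice Breyer.", "He has served this country."]
prompt = prompt_template.format(
    context="\n".join(docs),
    question="What did the president say about Justice Breyer?",
)
print(prompt)
```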
chain = load_qa_chain(OpenAI(temperature=0), chain_type="map_reduce", return_map_steps=True) chain({"input_documents": docs, "question": query}, return_only_outputs=True) {'intermediate_steps': [' "Tonight, I’d like to honor someone who has dedicated his life to serve this country: Justice Stephen Breyer—an Army vetera...
QUESTION_PROMPT = PromptTemplate( template=question_prompt_template, input_variables=["context", "question"] ) combine_prompt_template = """Given the following extracted parts of a long document and a question, create a final answer in Italian. If you don't know the answer, just say that you don't know. Don't try to ...
" Non c'è testo pertinente."], 'output_text': ' Non ha detto nulla riguardo a Justice Breyer.'} Batch Size When using the map_reduce chain, one thing to keep in mind is the batch size you are using during the map step. If this is too high, it could cause rate limiting errors. You can control this by setting the batch ...
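A framework-free sketch of what that batch size controls during the map step: documents go to the LLM in groups of `batch_size`, so a smaller value means fewer simultaneous requests. Names here are illustrative, not LangChain's:

```python
# Documents are sent to the LLM in groups of `batch_size` during the map
# step; lowering it reduces the chance of rate-limit errors.
docs = [f"doc-{i}" for i in range(10)]
batch_size = 4

batches = [docs[i:i + batch_size] for i in range(0, len(docs), batch_size)]
print(len(batches))  # 3
```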
Intermediate Steps We can also return the intermediate steps for refine chains, should we want to inspect them. This is done with the return_refine_steps variable. chain = load_qa_chain(OpenAI(temperature=0), chain_type="refine", return_refine_steps=True) chain({"input_documents": docs, "question": query}, return_only_...
'output_text': '\n\nThe president said that he wanted to honor Justice Breyer for his dedication to serving the country, his legacy of excellence, and his commitment to advancing liberty and justice, as well as for his support of the Equality Act and his commitment to protecting the rights of LGBTQ+ Americans. He also ...
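The refine chain's control flow is a left fold: start from an answer over the first document, then revise it once per remaining document. A sketch with a toy `update` standing in for the LLM call:

```python
# The refine chain is a left fold over the documents: an initial answer is
# revised once per remaining document. `update` is a toy stand-in for the
# LLM call that rewrites the answer given new context.
def refine(initial_answer, contexts, update):
    answer = initial_answer
    for ctx in contexts:
        answer = update(answer, ctx)
    return answer

result = refine('A0', ['c1', 'c2'], lambda a, c: f"{a}+{c}")
print(result)  # A0+c1+c2
```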
"Given the context information and not prior knowledge, " "answer the question: {question}\nYour answer should be in Italian.\n" ) initial_qa_prompt = PromptTemplate( input_variables=["context_str", "question"], template=initial_qa_template ) chain = load_qa_chain(OpenAI(temperature=0), chain_type="refine", ret...
"\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha reso omaggio al suo servizio e ha sostenuto la nomina di una top litigatrice in pratica privata, un ex difensore pubblico federale e una famiglia di insegnanti e agenti di polizia delle scuole pubbliche. Ha anche sottol...
'output_text': "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha reso omaggio al suo servizio e ha sostenuto la nomina di una top litigatrice in pratica privata, un ex difensore pubblico federale e una famiglia di insegnanti e agenti di polizia delle scuole pubbliche...
'score': '100'}, {'answer': ' This document does not answer the question', 'score': '0'}, {'answer': ' This document does not answer the question', 'score': '0'}, {'answer': ' This document does not answer the question', 'score': '0'}] Custom Prompts You can also use your own prompts with this chain. In this example...
output_parser=output_parser, ) chain = load_qa_chain(OpenAI(temperature=0), chain_type="map_rerank", return_intermediate_steps=True, prompt=PROMPT) query = "What did the president say about Justice Breyer" chain({"input_documents": docs, "question": query}, return_only_outputs=True) {'intermediate_steps': [{'answer': '...
.ipynb .pdf Vector DB Text Generation Contents Prepare Data Set Up Vector DB Set Up LLM Chain with Custom Prompt Generate Text Vector DB Text Generation# This notebook walks through how to use LangChain for text generation over a vector index. This is useful if we want to generate text that is able to draw from a lar...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_text_generation.html
.strip() ) repo_path = pathlib.Path(d) markdown_files = list(repo_path.glob("*/*.md")) + list( repo_path.glob("*/*.mdx") ) for markdown_file in markdown_files: with open(markdown_file, "r") as f: relative_path = markdown_file.relative_to(re...
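The loading loop above is cut off; a self-contained sketch of the same idea, building a throwaway directory instead of cloning a repo. Paths and file contents are illustrative:

```python
import pathlib
import tempfile

# Build a throwaway "checkout" so the walk below has something to find.
repo_path = pathlib.Path(tempfile.mkdtemp())
(repo_path / "docs").mkdir()
(repo_path / "docs" / "env.md").write_text("# Environment variables\nUse Deno.env.")

# Mirror the loop above: collect every */*.md and */*.mdx file, keeping each
# file's text alongside a source path relative to the checkout root.
sources = []
markdown_files = list(repo_path.glob("*/*.md")) + list(repo_path.glob("*/*.mdx"))
for markdown_file in markdown_files:
    with open(markdown_file, "r") as f:
        relative_path = markdown_file.relative_to(repo_path)
        sources.append({"source": str(relative_path), "text": f.read()})

print(sources[0]["source"])  # docs/env.md
```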
Set Up LLM Chain with Custom Prompt# Next, let’s set up a simple LLM chain but give it a custom prompt for blog post generation. Note that the custom prompt is parameterized and takes two inputs: context, which will be the documents fetched from the vector search, and topic, which is given by the user. from langchain.c...
[{'text': '\n\nEnvironment variables are a great way to store and access sensitive information in your Deno applications. Deno offers built-in support for environment variables with `Deno.env`, and you can also use a `.env` file to store and access environment variables.\n\nUsing `Deno.env` is simple. It has getter and...
code.\n\nIn Deno, environment variables can be set in a few different ways. The most common way is to use the `VAR=value` syntax. This will set the environment variable `VAR` to the value `value`. This can be used to set any number of environment variables before running a command. For example, if we wanted to set the ...
to grant the Deno process access to environment variables. This can be done by passing the `--allow-env` flag to the `deno run` command. You can also specify which environment variables you want to grant access to, like this:\n\n```shell\n# Allow access to only the HOME env var\ndeno run --allow-env=HOME env.js\n```\n\...
is a read-only object, meaning that you cannot directly modify the environment variables. Instead, you must use the `Deno.env.set()` function to set environment variables. This function takes two arguments: the name of the environment variable and the value to set it to. For example, if you wanted to set the `FOO` envi...
.ipynb .pdf Retrieval Question Answering with Sources Contents Chain Type Retrieval Question Answering with Sources# This notebook goes over how to do question-answering with sources over an Index. It does this by using the RetrievalQAWithSourcesChain, which does the lookup of the documents from an Index. from langch...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa_with_sources.html
from langchain import OpenAI chain = RetrievalQAWithSourcesChain.from_chain_type(OpenAI(temperature=0), chain_type="stuff", retriever=docsearch.as_retriever()) chain({"question": "What did the president say about Justice Breyer"}, return_only_outputs=True) {'answer': ' The president honored Justice Breyer for his servi...
'sources': '31-pl'} The above way lets you easily change the chain_type, but it does not provide much flexibility over the parameters of that chain type. If you want to control those parameters, you can load the chain directly (as you did in this notebook) and then pass it to the RetrievalQAWithS...
.ipynb .pdf Summarization Contents Prepare Data Quickstart The stuff Chain The map_reduce Chain The refine Chain Summarization# This notebook walks through how to use LangChain for summarization over a list of documents. It covers three different chain types: stuff, map_reduce, and refine. For a more in depth explana...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html
chain.run(docs) ' In response to Russian aggression in Ukraine, the United States and its allies are taking action to hold Putin accountable, including economic sanctions, asset seizures, and military assistance. The US is also providing economic and humanitarian aid to Ukraine, and has passed the American Rescue Plan ...
chain = load_summarize_chain(llm, chain_type="stuff", prompt=PROMPT) chain.run(docs) "\n\nIn questa serata, il Presidente degli Stati Uniti ha annunciato una serie di misure per affrontare la crisi in Ucraina, causata dall'aggressione di Putin. Ha anche annunciato l'invio di aiuti economici, militari e umanitari all'Uc...
chain = load_summarize_chain(OpenAI(temperature=0), chain_type="map_reduce", return_intermediate_steps=True) chain({"input_documents": docs}, return_only_outputs=True) {'map_steps': [" In response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sancti...
'output_text': " In response to Russia's aggression in Ukraine, the United States and its allies have imposed economic sanctions and are taking other measures to hold Putin accountable. The US is also providing economic and military assistance to Ukraine, protecting NATO countries, and passing legislation to help strug...
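The map_reduce pattern behind that output: summarize each document independently (the map steps), then summarize the concatenation of those summaries. A sketch with a toy truncating `summarize` in place of the LLM call; the documents are illustrative:

```python
# Map step: summarize each document independently. Reduce step: summarize
# the concatenated summaries. `summarize` is a toy truncation standing in
# for the per-chunk LLM call.
def summarize(text, max_words=5):
    return " ".join(text.split()[:max_words])

docs = [
    "Putin's aggression in Ukraine was met with economic sanctions from the United States and its allies.",
    "The American Rescue Plan provided economic relief to millions of Americans during the pandemic.",
]
map_steps = [summarize(d) for d in docs]
output_text = summarize(" ".join(map_steps), max_words=10)
print(map_steps)
print(output_text)
```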
{'intermediate_steps': ["\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Gli Stati Uniti e i loro alleati stanno ora imponendo sanzion...
"\n\nIl Presidente Biden ha lottato per passare l'American Rescue Plan per aiutare le persone che soffrivano a causa della pandemia. Il piano ha fornito sollievo economico immediato a milioni di americani, ha aiutato a mettere cibo sulla loro tavola, a mantenere un tetto sopra le loro teste e a ridurre il costo dell'as...
chain.run(docs) "\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotte...
chain({"input_documents": docs}, return_only_outputs=True) {'refine_steps': [" In response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after...
"\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotten gains. We are ...
'output_text': "\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotten...
{text} CONCISE SUMMARY IN ITALIAN:""" PROMPT = PromptTemplate(template=prompt_template, input_variables=["text"]) refine_template = ( "Your job is to produce a final summary\n" "We have provided an existing summary up to a certain point: {existing_answer}\n" "We have the opportunity to refine the existing s...
{'intermediate_steps': ["\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Insieme ai nostri alleati, stiamo imponendo sanzioni economic...
"\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Insieme ai nostri alleati, stiamo imponendo sanzioni economiche, tagliando l'accesso ...
.ipynb .pdf Analyze Document Contents Summarize Question Answering Analyze Document# The AnalyzeDocumentChain takes in a single document, splits it up, and then runs it through a CombineDocumentsChain. This makes it useful as an end-to-end chain. with open("../../state_of_th...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/analyze_document.html
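A sketch of the split step the AnalyzeDocumentChain performs before handing chunks to the combine-documents chain; fixed-width character splitting here is a simplification of the real text splitter:

```python
# Split the single input document into chunks before running the
# combine-documents chain over them.
def split_text(text, chunk_size=40):
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

state_of_the_union = "Madam Speaker, Madam Vice President, our First Lady and Second Gentleman."
chunks = split_text(state_of_the_union)
print(len(chunks))  # 2
```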
summarize_document_chain.run(state_of_the_union) " In this speech, President Biden addresses the American people and the world, discussing the recent aggression of Russia's Vladimir Putin in Ukraine and the US response. He outlines economic sanctions and other measures taken to hold Putin accountable, and announces the...
.ipynb .pdf Chat Over Documents with Chat History Contents Pass in chat history Return Source Documents ConversationalRetrievalChain with search_distance ConversationalRetrievalChain with map_reduce ConversationalRetrievalChain with Question Answering with sources ConversationalRetrievalChain with streaming to stdout...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0) documents = text_splitter.split_documents(documents) embeddings = OpenAIEmbeddings() vectorstore = Chroma.from_documents(documents, embeddings) Using embedded DuckDB without persistence: data will be transient We can now create a memory object, whi...
result['answer'] ' Ketanji Brown Jackson succeeded Justice Stephen Breyer on the United States Supreme Court.' Pass in chat history# In the above example, we used a Memory object to track chat history. We can also just pass it in explicitly. In order to do this, we need to initialize a chain without any memory object. ...
Return Source Documents# You can also easily return source documents from the ConversationalRetrievalChain. This is useful when you want to inspect what documents were returned. qa = ConversationalRetrievalChain.from_llm(OpenAI(temperature=0), vectorstore.as_retriever(), return_source_documents=True) chat_history =...
ConversationalRetrievalChain with search_distance# If you are using a vector store that supports filtering by search distance, you can add a threshold value parameter. vectordbkwargs = {"search_distance": 0.9} qa = ConversationalRetrievalChain.from_llm(OpenAI(temperature=0), vectorstore.as_retriever(), return_source_do...
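The effect of a search-distance threshold can be sketched in plain Python: scored results whose distance exceeds the threshold are dropped before they ever reach the chain. The keep-if-within-threshold rule below mirrors the `search_distance: 0.9` example, but the exact filter semantics depend on the vector store:

```python
def filter_by_distance(scored_docs, search_distance):
    """Keep (doc, distance) pairs whose distance is within the threshold."""
    return [(doc, d) for doc, d in scored_docs if d <= search_distance]

scored = [("doc A", 0.12), ("doc B", 0.85), ("doc C", 0.97)]
kept = filter_by_distance(scored, search_distance=0.9)
print([doc for doc, _ in kept])  # ['doc A', 'doc B']
```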
query = "What did the president say about Ketanji Brown Jackson" result = chain({"question": query, "chat_history": chat_history}) result['answer'] " The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, from a...
result['answer'] " The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, from a family of public school educators and police officers, a consensus builder, and has received a broad range of support from the Fra...
doc_chain = load_qa_chain(streaming_llm, chain_type="stuff", prompt=QA_PROMPT) qa = ConversationalRetrievalChain( retriever=vectorstore.as_retriever(), combine_docs_chain=doc_chain, question_generator=question_generator) chat_history = [] query = "What did the president say about Ketanji Brown Jackson" result = qa(...
return "\n".join(res) qa = ConversationalRetrievalChain.from_llm(OpenAI(temperature=0), vectorstore.as_retriever(), get_chat_history=get_chat_history) chat_history = [] query = "What did the president say about Ketanji Brown Jackson" result = qa({"question": query, "chat_history": chat_history}) result['answer'] " The ...
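The `return "\n".join(res)` above is the tail of a custom `get_chat_history` function: a callable that turns the list of (human, ai) tuples into the single string the question-condensing prompt sees. A self-contained version of the full shape (the `Human:`/`Assistant:` labels are one reasonable formatting choice, not something the API requires):

```python
def get_chat_history(chat_history):
    """Format (human, ai) turns into one string for the condense-question prompt."""
    res = []
    for human, ai in chat_history:
        res.append(f"Human: {human}\nAssistant: {ai}")
    return "\n".join(res)

history = [("Who was nominated?", "Ketanji Brown Jackson.")]
print(get_chat_history(history))
# Human: Who was nominated?
# Assistant: Ketanji Brown Jackson.
```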
Retrieval Question/Answering# This example showcases question answering over an index. from langchain.embeddings.openai import OpenAIEmbeddings from langchain.vectorstores import Chroma from langchain.text_splitter imp...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa.html
qa.run(query) " The president said that she is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, and from a family of public school educators and police officers. He also said that she is a consensus builder and has received a broad range of support, from...
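Under the hood, RetrievalQA is retrieve-then-read: embed the query, fetch the most similar chunks, and stuff them into a prompt for the LLM. A toy version with hand-rolled cosine similarity (the two-dimensional "embeddings" are fabricated purely for illustration):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def retrieve(query_vec, index, k=1):
    """Return the k chunk texts most similar to the query vector."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

index = [
    ("chunk about the nominee", [1.0, 0.1]),
    ("chunk about inflation", [0.1, 1.0]),
]
docs = retrieve([0.9, 0.2], index, k=1)
prompt = "Context:\n" + "\n".join(docs) + "\n\nQuestion: What did the president say?"
print(docs)  # ['chunk about the nominee']
```

The "stuff" chain type is exactly this prompt-building step: all retrieved chunks concatenated into a single context.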
The above way allows you to really simply change the chain_type, but it doesn't provide a ton of flexibility over the parameters to that chain type. If you want to control those parameters, you can load the chain directly (as you did in this notebook) and then pass that directly to the RetrievalQA chain with the combine_d...
Answer in Italian:""" PROMPT = PromptTemplate( template=prompt_template, input_variables=["context", "question"] ) chain_type_kwargs = {"prompt": PROMPT} qa = RetrievalQA.from_chain_type(llm=OpenAI(), chain_type="stuff", retriever=docsearch.as_retriever(), chain_type_kwargs=chain_type_kwargs) query = "What did the ...
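The custom prompt is just a template with `context` and `question` slots filled in before the LLM call. The substitution step can be sketched with `str.format` (PromptTemplate does essentially this plus input-variable validation; the instruction text above the `Answer in Italian:` line is illustrative, since the original is truncated here):

```python
prompt_template = """Use the following pieces of context to answer the question.

{context}

Question: {question}
Answer in Italian:"""

filled = prompt_template.format(
    context="Ketanji Brown Jackson was nominated to the Supreme Court.",
    question="What did the president say about Ketanji Brown Jackson?",
)
print(filled.endswith("Answer in Italian:"))  # True
```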
result["result"] " The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice and a former federal public defender from a family of public school educators and police officers, and that she has received a broad range of support from the Fraternal Ord...
Document(page_content='A former top litigator in private practice. A former federal public defender. And from a family of public school educators and police officers. A consensus builder. Since she’s been nominated, she’s received a broad range of support—from the Fraternal Order of Police to former judges appointed by...
Document(page_content='And for our LGBTQ+ Americans, let’s finally get the bipartisan Equality Act to my desk. The onslaught of state laws targeting transgender Americans and their families is wrong. \n\nAs I said last year, especially to our younger transgender Americans, I will always have your back as your President...
Document(page_content='Tonight, I’m announcing a crackdown on these companies overcharging American businesses and consumers. \n\nAnd as Wall Street firms take over more nursing homes, quality in those homes has gone down and costs have gone up. \n\nThat ends on my watch. \n\nMedicare is going to set higher standards ...
Hypothetical Document Embeddings# This notebook goes over how to use Hypothetical Document Embeddings (HyDE), as described in this paper. At a high level, HyDE is an embedding technique that takes queries, gene...
/content/https://python.langchain.com/en/latest/modules/chains/index_examples/hyde.html
multi_llm = OpenAI(n=4, best_of=4) embeddings = HypotheticalDocumentEmbedder.from_llm(multi_llm, base_embeddings, "web_search") result = embeddings.embed_query("Where is the Taj Mahal?") Using our own prompts# Besides using preconfigured prompts, we can also easily construct our own prompts and use those in the LLMChai...
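With `n=4, best_of=4`, HyDE generates several hypothetical answer documents and combines their embeddings into one query vector; the combine step is an element-wise average. A pure-Python sketch of that averaging (the generator and embedder here are crude stubs standing in for the LLM and the real embedding model):

```python
def embed(text):
    # Stub embedder: two crude lexical features instead of a real model.
    return [float(len(text)), float(text.count("Taj"))]

def hyde_embed_query(query, generate_fn, n=4):
    """Embed each hypothetical document and average the vectors element-wise."""
    vectors = [embed(doc) for doc in generate_fn(query, n)]
    dims = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dims)]

def fake_generations(query, n):
    # Stand-in for the LLM's n sampled hypothetical documents.
    return [f"The Taj Mahal is in Agra, India. (variant {i})" for i in range(n)]

vec = hyde_embed_query("Where is the Taj Mahal?", fake_generations, n=4)
print(len(vec))  # 2
```

The resulting averaged vector is what gets compared against the document index, so noise from any single hypothetical generation is smoothed out.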
from langchain.vectorstores import Chroma with open("../../state_of_the_union.txt") as f: state_of_the_union = f.read() text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0) texts = text_splitter.split_text(state_of_the_union) docsearch = Chroma.from_texts(texts, embeddings) query = "What did the ...
And I did that 4 days ago, when I nominated Circuit Court of Appeals Judge Ketanji Brown Jackson. One of our nation’s top legal minds, who will continue Justice Breyer’s legacy of excellence.