| parent_paper_title | parent_paper_arxiv_id | citation_shorthand | raw_citation_text | cited_paper_title | cited_paper_arxiv_link | cited_paper_abstract | has_metadata | is_arxiv_paper | bib_paper_authors | bib_paper_year | bib_paper_month | bib_paper_url | bib_paper_doi | bib_paper_journal | original_title | search_res_title | search_res_url | search_res_content |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | zengScalableEffectiveGenerative2023b | \cite{zengScalableEffectiveGenerative2023b} | Scalable and Effective Generative Information Retrieval | http://arxiv.org/abs/2311.09134v1 | Recent research has shown that transformer networks can be used as differentiable search indexes by representing each document as a sequences of document ID tokens. These generative retrieval models cast the retrieval problem to a document ID generation problem for each given query. Despite their elegant design, existi... | true | true | Hansi Zeng and Chen Luo and Bowen Jin and Sheikh Muhammad Sarwar and Tianxin Wei and Hamed Zamani | null | null | https://doi.org/10.1145/3589334.3645477 | 10.1145/3589334.3645477 | null | Scalable and Effective Generative Information Retrieval | Scalable and Effective Generative Information Retrieval | http://arxiv.org/pdf/2311.09134v1 | Recent research has shown that transformer networks can be used as differentiable search indexes by representing each document as a sequences of document ID tokens. These generative retrieval models cast the retrieval problem to a document ID generation problem for each given query. Despite their elegant design, existi... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | askariFewshotIndexing2024 | \cite{askariFewshotIndexing2024} | Generative Retrieval with Few-shot Indexing | http://arxiv.org/abs/2408.02152v1 | Existing generative retrieval (GR) approaches rely on training-based indexing, i.e., fine-tuning a model to memorise the associations between a query and the document identifier (docid) of a relevant document. Training-based indexing has three limitations: high training overhead, under-utilization of the pre-trained kn... | true | true | Arian Askari and Chuan Meng and Mohammad Aliannejadi and Zhaochun Ren and Evangelos Kanoulas and Suzan Verberne | null | null | https://doi.org/10.48550/arXiv.2408.02152 | 10.48550/ARXIV.2408.02152 | CoRR | Generative Retrieval with Few-shot Indexing | (PDF) Generative Retrieval with Few-shot Indexing - ResearchGate | https://www.researchgate.net/publication/382884626_Generative_Retrieval_with_Few-shot_Indexing | It has a novel few-shot indexing process, where we prompt an LLM to generate docids for all documents in a corpus, ultimately creating a docid |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | cont-learning-gr2023cikm | \cite{cont-learning-gr2023cikm} | Continual Learning for Generative Retrieval over Dynamic Corpora | http://arxiv.org/abs/2308.14968v1 | Generative retrieval (GR) directly predicts the identifiers of relevant documents (i.e., docids) based on a parametric model. It has achieved solid performance on many ad-hoc retrieval tasks. So far, these tasks have assumed a static document collection. In many practical scenarios, however, document collections are dy... | true | true | Chen, Jiangui and Zhang, Ruqing and Guo, Jiafeng and de Rijke, Maarten and Chen, Wei and Fan, Yixing and Cheng, Xueqi | null | null | https://doi.org/10.1145/3583780.3614821 | 10.1145/3583780.3614821 | null | Continual Learning for Generative Retrieval over Dynamic Corpora | Continual Learning for Generative Retrieval over Dynamic Corpora | http://arxiv.org/pdf/2308.14968v1 | Generative retrieval (GR) directly predicts the identifiers of relevant documents (i.e., docids) based on a parametric model. It has achieved solid performance on many ad-hoc retrieval tasks. So far, these tasks have assumed a static document collection. In many practical scenarios, however, document collections are dy... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | liu2024robustnessgenerative | \cite{liu2024robustnessgenerative} | On the Robustness of Generative Information Retrieval Models | http://arxiv.org/abs/2412.18768v1 | Generative information retrieval methods retrieve documents by directly generating their identifiers. Much effort has been devoted to developing effective generative IR models. Less attention has been paid to the robustness of these models. It is critical to assess the out-of-distribution (OOD) generalization of genera... | true | true | Yu-An Liu and Ruqing Zhang and Jiafeng Guo and Changjiang Zhou and Maarten de Rijke and Xueqi Cheng | null | null | https://arxiv.org/abs/2412.18768 | null | null | On the Robustness of Generative Information Retrieval Models | On the Robustness of Generative Information Retrieval Models | http://arxiv.org/pdf/2412.18768v1 | Generative information retrieval methods retrieve documents by directly generating their identifiers. Much effort has been devoted to developing effective generative IR models. Less attention has been paid to the robustness of these models. It is critical to assess the out-of-distribution (OOD) generalization of genera... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | liuRobustnessGenerativeRetrieval2023 | \cite{liuRobustnessGenerativeRetrieval2023} | On the Robustness of Generative Retrieval Models: An Out-of-Distribution Perspective | null | null | true | false | Yu{-}An Liu and Ruqing Zhang and Jiafeng Guo and Wei Chen and Xueqi Cheng | null | null | https://doi.org/10.48550/arXiv.2306.12756 | 10.48550/ARXIV.2306.12756 | CoRR | On the Robustness of Generative Retrieval Models: An Out-of-Distribution Perspective | On the Robustness of Generative Retrieval Models: An Out ... | https://arxiv.org/abs/2306.12756 | **arXiv:2306.12756** (cs) View a PDF of the paper titled On the Robustness of Generative Retrieval Models: An Out-of-Distribution Perspective, by Yu-An Liu and 4 other authors View a PDF of the paper titled On the Robustness of Generative Retrieval Models: An Out-of-Distribution Perspective, by Yu-An Liu and 4 other a... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | leeNonparametricDecodingGenerative2023 | \cite{leeNonparametricDecodingGenerative2023} | Nonparametric Decoding for Generative Retrieval | http://arxiv.org/abs/2210.02068v3 | The generative retrieval model depends solely on the information encoded in its model parameters without external memory, its information capacity is limited and fixed. To overcome the limitation, we propose Nonparametric Decoding (Np Decoding) which can be applied to existing generative retrieval models. Np Decoding u... | true | true | Lee, Hyunji and Kim, JaeYoung and Chang, Hoyeon and Oh, Hanseok and Yang, Sohee and Karpukhin, Vladimir and Lu, Yi and Seo, Minjoon | null | null | null | null | null | Nonparametric Decoding for Generative Retrieval | Nonparametric Decoding for Generative Retrieval | http://arxiv.org/pdf/2210.02068v3 | The generative retrieval model depends solely on the information encoded in its model parameters without external memory, its information capacity is limited and fixed. To overcome the limitation, we propose Nonparametric Decoding (Np Decoding) which can be applied to existing generative retrieval models. Np Decoding u... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | yuan2024generative-memory-burden | \cite{yuan2024generative-memory-burden} | Generative Dense Retrieval: Memory Can Be a Burden | http://arxiv.org/abs/2401.10487v1 | Generative Retrieval (GR), autoregressively decoding relevant document identifiers given a query, has been shown to perform well under the setting of small-scale corpora. By memorizing the document corpus with model parameters, GR implicitly achieves deep interaction between query and document. However, such a memorizi... | true | true | Peiwen Yuan and Xinglin Wang and Shaoxiong Feng and Boyuan Pan and Yiwei Li and Heda Wang and Xupeng Miao and Kan Li | null | null | https://aclanthology.org/2024.eacl-long.173 | null | null | Generative Dense Retrieval: Memory Can Be a Burden | Generative Dense Retrieval: Memory Can Be a Burden | http://arxiv.org/pdf/2401.10487v1 | Generative Retrieval (GR), autoregressively decoding relevant document identifiers given a query, has been shown to perform well under the setting of small-scale corpora. By memorizing the document corpus with model parameters, GR implicitly achieves deep interaction between query and document. However, such a memorizi... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | wangNOVOLearnableInterpretable2023 | \cite{wangNOVOLearnableInterpretable2023} | NOVO: Learnable and Interpretable Document Identifiers for Model-Based IR | null | null | true | false | Wang, Zihan and Zhou, Yujia and Tu, Yiteng and Dou, Zhicheng | null | null | https://doi.org/10.1145/3583780.3614993 | 10.1145/3583780.3614993 | null | NOVO: Learnable and Interpretable Document Identifiers for Model-Based IR | Learnable and Interpretable Document Identifiers for Model ... | https://www.researchgate.net/publication/374903378_NOVO_Learnable_and_Interpretable_Document_Identifiers_for_Model-Based_IR | NOVO [389] introduces learnable continuous N-gram DocIDs, refining embeddings through query denoising and retrieval tasks. LMIndexer [153] generates neural |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | kishoreIncDSI2023 | \cite{kishoreIncDSI2023} | IncDSI: Incrementally Updatable Document Retrieval | http://arxiv.org/abs/2307.10323v2 | Differentiable Search Index is a recently proposed paradigm for document retrieval, that encodes information about a corpus of documents within the parameters of a neural network and directly maps queries to corresponding documents. These models have achieved state-of-the-art performances for document retrieval across ... | true | true | Kishore, Varsha and Wan, Chao and Lovelace, Justin and Artzi, Yoav and Weinberger, Kilian Q. | null | null | null | null | null | IncDSI: Incrementally Updatable Document Retrieval | IncDSI: Incrementally Updatable Document Retrieval | http://arxiv.org/pdf/2307.10323v2 | Differentiable Search Index is a recently proposed paradigm for document retrieval, that encodes information about a corpus of documents within the parameters of a neural network and directly maps queries to corresponding documents. These models have achieved state-of-the-art performances for document retrieval across ... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | mehtaDSIpp2023 | \cite{mehtaDSIpp2023} | {DSI}++: Updating Transformer Memory with New Documents | null | null | true | false | Mehta, Sanket Vaibhav and Gupta, Jai and Tay, Yi and Dehghani, Mostafa and Tran, Vinh Q. and Rao, Jinfeng and Najork, Marc and Strubell, Emma and Metzler, Donald | null | null | https://aclanthology.org/2023.emnlp-main.510/ | 10.18653/v1/2023.emnlp-main.510 | null | {DSI}++: Updating Transformer Memory with New Documents | DSI++: Updating Transformer Memory with New Documents | https://aclanthology.org/2023.emnlp-main.510/ | DSI++: Updating Transformer Memory with New Documents - ACL Anthology Anthology ID:2023.emnlp-main.510 Volume:Proceedings of the 2023 Conference on Empirical Methods in Natural Language ProcessingMonth:December Year:2023 Address:Singapore Editors:Houda Bouamor, Juan Pino, Kalika BaliVenue:EMNLPSIG:Publisher:Association... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | guoContinualGenerative2024 | \cite{guoContinualGenerative2024} | CorpusBrain++: A Continual Generative Pre-Training Framework for Knowledge-Intensive Language Tasks | null | null | true | false | Jiafeng Guo and Changjiang Zhou and Ruqing Zhang and Jiangui Chen and Maarten de Rijke and Yixing Fan and Xueqi Cheng | null | null | https://arxiv.org/abs/2402.16767 | null | null | CorpusBrain++: A Continual Generative Pre-Training Framework for Knowledge-Intensive Language Tasks | [2402.16767] CorpusBrain++: A Continual Generative Pre-Training ... | https://arxiv.org/abs/2402.16767 | Title:CorpusBrain++: A Continual Generative Pre-Training Framework for Knowledge-Intensive Language Tasks View a PDF of the paper titled CorpusBrain++: A Continual Generative Pre-Training Framework for Knowledge-Intensive Language Tasks, by Jiafeng Guo and 5 other authors View a PDF of the paper titled CorpusBrain++: A... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | ahmedNeuroSymbolicLearning2023 | \cite{ahmedNeuroSymbolicLearning2023} | Semantic Strengthening of Neuro-Symbolic Learning | http://arxiv.org/abs/2302.14207v1 | Numerous neuro-symbolic approaches have recently been proposed typically with the goal of adding symbolic knowledge to the output layer of a neural network. Ideally, such losses maximize the probability that the neural network's predictions satisfy the underlying domain. Unfortunately, this type of probabilistic infere... | true | true | Ahmed, Kareem and Chang, Kai-Wei and Van den Broeck, Guy | null | 25--27 Apr | https://proceedings.mlr.press/v206/ahmed23a.html | null | null | Semantic Strengthening of Neuro-Symbolic Learning | [PDF] Semantic Strengthening of Neuro-Symbolic Learning | https://proceedings.mlr.press/v206/ahmed23a/ahmed23a.pdf | Neuro-symbolic learning aims to add symbolic knowledge to neural networks, using a probabilistic approach to scale inference while retaining sound semantics. |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | mustafaStrcutredOutputPrediction2021 | \cite{mustafaStrcutredOutputPrediction2021} | Fine-grained Generalization Analysis of Structured Output Prediction | http://arxiv.org/abs/2106.00115v1 | In machine learning we often encounter structured output prediction problems (SOPPs), i.e. problems where the output space admits a rich internal structure. Application domains where SOPPs naturally occur include natural language processing, speech recognition, and computer vision. Typical SOPPs have an extremely large... | true | true | Mustafa, Waleed and Lei, Yunwen and Ledent, Antoine and Kloft, Marius | null | null | https://doi.org/10.24963/ijcai.2021/391 | 10.24963/ijcai.2021/391 | null | Fine-grained Generalization Analysis of Structured Output Prediction | [PDF] Fine-grained Generalization Analysis of Structured Output Prediction | https://www.ijcai.org/proceedings/2021/0391.pdf | We consider two popular methods for structured output prediction: stochastic gradient descent (SGD) and regularized risk minimization (RRM). We adapt the |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | nishinoGeneralizationAnalysisLearning2022a | \cite{nishinoGeneralizationAnalysisLearning2022a} | Generalization Analysis on Learning with a Concurrent Verifier | http://arxiv.org/abs/2210.05331v1 | Machine learning technologies have been used in a wide range of practical systems. In practical situations, it is natural to expect the input-output pairs of a machine learning model to satisfy some requirements. However, it is difficult to obtain a model that satisfies requirements by just learning from examples. A si... | true | true | Nishino, Masaaki and Nakamura, Kengo and Yasuda, Norihito | null | null | null | null | null | Generalization Analysis on Learning with a Concurrent Verifier | Generalization Analysis on Learning with a Concurrent Verifier | http://arxiv.org/pdf/2210.05331v1 | Machine learning technologies have been used in a wide range of practical systems. In practical situations, it is natural to expect the input-output pairs of a machine learning model to satisfy some requirements. However, it is difficult to obtain a model that satisfies requirements by just learning from examples. A si... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | nishinoUnderstandingCV2025 | \cite{nishinoUnderstandingCV2025} | Understanding the impact of introducing constraints at inference time on generalization error | null | null | true | false | Nishino, Masaaki and Nakamura, Kengo and Yasuda, Norihito | null | null | null | null | null | Understanding the impact of introducing constraints at inference time on generalization error | [PDF] Understanding the Impact of Introducing Constraints at Inference ... | https://raw.githubusercontent.com/mlresearch/v235/main/assets/nishino24a/nishino24a.pdf | This paper analyses how the generalization error bounds change when we only put constraints in the inference time. Our main finding is that a class of loss |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | zhangSurveyControllableText2023 | \cite{zhangSurveyControllableText2023} | A Survey of Controllable Text Generation using Transformer-based Pre-trained Language Models | http://arxiv.org/abs/2201.05337v5 | Controllable Text Generation (CTG) is emerging area in the field of natural language generation (NLG). It is regarded as crucial for the development of advanced text generation technologies that better meet the specific constraints in practical applications. In recent years, methods using large-scale pre-trained langua... | true | true | Zhang, Hanqing and Song, Haolin and Li, Shaoyu and Zhou, Ming and Song, Dawei | null | null | https://doi.org/10.1145/3617680 | 10.1145/3617680 | ACM Comput. Surv. | A Survey of Controllable Text Generation using Transformer-based Pre-trained Language Models | A Survey of Controllable Text Generation Using Transformer-based ... | https://dl.acm.org/doi/10.1145/3617680 | This article is closely related to two key aspects: controllable text generation and pre-trained language models, which will be briefly introduced in this |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | mireshghallahControllableTextGeneration2022 | \cite{mireshghallahControllableTextGeneration2022} | Mix and Match: Learning-free Controllable Text Generation using Energy Language Models | http://arxiv.org/abs/2203.13299v2 | Recent work on controlled text generation has either required attribute-based fine-tuning of the base language model (LM), or has restricted the parameterization of the attribute discriminator to be compatible with the base autoregressive LM. In this work, we propose Mix and Match LM, a global score-based alternative f... | true | true | Mireshghallah, Fatemehsadat and Goyal, Kartik and Berg-Kirkpatrick, Taylor | null | null | https://aclanthology.org/2022.acl-long.31/ | 10.18653/v1/2022.acl-long.31 | null | Mix and Match: Learning-free Controllable Text Generation using Energy Language Models | Mix and Match: Learning-free Controllable Text Generation ... | https://cseweb.ucsd.edu/~fmireshg/acl2022_mix_match.pdf | by F Mireshghallah · Cited by 86 — We interpret the task of controllable generation as drawing samples from an energy-based model whose energy values are a linear combination of scores from black |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | mudgalControlledDecoding2025 | \cite{mudgalControlledDecoding2025} | Controlled Decoding from Language Models | http://arxiv.org/abs/2310.17022v3 | KL-regularized reinforcement learning (RL) is a popular alignment framework to control the language model responses towards high reward outcomes. We pose a tokenwise RL objective and propose a modular solver for it, called controlled decoding (CD). CD exerts control through a separate prefix scorer module, which is tra... | true | true | Mudgal, Sidharth and Lee, Jong and Ganapathy, Harish and Li, YaGuang and Wang, Tao and Huang, Yanping and Chen, Zhifeng and Cheng, Heng-Tze and Collins, Michael and Strohman, Trevor and Chen, Jilin and Beutel, Alex and Beirami, Ahmad | null | null | null | null | null | Controlled Decoding from Language Models | Controlled Decoding from Language Models | http://arxiv.org/pdf/2310.17022v3 | KL-regularized reinforcement learning (RL) is a popular alignment framework to control the language model responses towards high reward outcomes. We pose a tokenwise RL objective and propose a modular solver for it, called controlled decoding (CD). CD exerts control through a separate prefix scorer module, which is tra... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | kimCriticGuidedDecoding2023 | \cite{kimCriticGuidedDecoding2023} | Critic-Guided Decoding for Controlled Text Generation | http://arxiv.org/abs/2212.10938v1 | Steering language generation towards objectives or away from undesired content has been a long-standing goal in utilizing language models (LM). Recent work has demonstrated reinforcement learning and weighted decoding as effective approaches to achieve a higher level of language control and quality with pros and cons. ... | true | true | Kim, Minbeom and Lee, Hwanhee and Yoo, Kang Min and Park, Joonsuk and Lee, Hwaran and Jung, Kyomin | null | null | https://aclanthology.org/2023.findings-acl.281/ | 10.18653/v1/2023.findings-acl.281 | null | Critic-Guided Decoding for Controlled Text Generation | [2212.10938] Critic-Guided Decoding for Controlled Text Generation | https://arxiv.org/abs/2212.10938 | View a PDF of the paper titled Critic-Guided Decoding for Controlled Text Generation, by Minbeom Kim and 5 other authors In this work, we propose a novel critic decoding method for controlled language generation (CriticControl) that combines the strengths of reinforcement learning and weighted decoding. View a PDF of t... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | chakrabortyPrincipledDecodingLLM2024 | \cite{chakrabortyPrincipledDecodingLLM2024} | Transfer Q Star: Principled Decoding for LLM Alignment | http://arxiv.org/abs/2405.20495v1 | Aligning foundation models is essential for their safe and trustworthy deployment. However, traditional fine-tuning methods are computationally intensive and require updating billions of model parameters. A promising alternative, alignment via decoding, adjusts the response distribution directly without model updates t... | true | true | Chakraborty, Souradip and Ghosal, Soumya Suvra and Yin, Ming and Manocha, Dinesh and Wang, Mengdi and Bedi, Amrit Singh and Huang, Furong | null | null | null | null | arXiv preprint arXiv:2405.20495 | Transfer Q Star: Principled Decoding for LLM Alignment | Transfer Q Star: Principled Decoding for LLM Alignment | http://arxiv.org/pdf/2405.20495v1 | Aligning foundation models is essential for their safe and trustworthy deployment. However, traditional fine-tuning methods are computationally intensive and require updating billions of model parameters. A promising alternative, alignment via decoding, adjusts the response distribution directly without model updates t... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | kimGuaranteedGenerationLarge2024 | \cite{kimGuaranteedGenerationLarge2024} | Guaranteed Generation from Large Language Models | http://arxiv.org/abs/2410.06716v2 | As large language models (LLMs) are increasingly used across various applications, there is a growing need to control text generation to satisfy specific constraints or requirements. This raises a crucial question: Is it possible to guarantee strict constraint satisfaction in generated outputs while preserving the dist... | true | true | Minbeom Kim and Thibaut Thonet and Jos Rozen and Hwaran Lee and Kyomin Jung and Marc Dymetman | null | null | https://arxiv.org/abs/2410.06716 | null | null | Guaranteed Generation from Large Language Models | Guaranteed Generation from Large Language Models | http://arxiv.org/pdf/2410.06716v2 | As large language models (LLMs) are increasingly used across various applications, there is a growing need to control text generation to satisfy specific constraints or requirements. This raises a crucial question: Is it possible to guarantee strict constraint satisfaction in generated outputs while preserving the dist... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | honghuaLogicalControl2024 | \cite{honghuaLogicalControl2024} | Adaptable Logical Control for Large Language Models | http://arxiv.org/abs/2406.13892v2 | Despite the success of Large Language Models (LLMs) on various tasks following human instructions, controlling model generation at inference time poses a persistent challenge. In this paper, we introduce Ctrl-G, an adaptable framework that facilitates tractable and flexible control of LLM generation to reliably follow ... | true | true | Honghua Zhang and Po-Nien Kung and Masahiro Yoshida and Guy Van den Broeck and Nanyun Peng | null | null | https://openreview.net/forum?id=58X9v92zRd | null | null | Adaptable Logical Control for Large Language Models | Adaptable Logical Control for Large Language Models | http://arxiv.org/pdf/2406.13892v2 | Despite the success of Large Language Models (LLMs) on various tasks following human instructions, controlling model generation at inference time poses a persistent challenge. In this paper, we introduce Ctrl-G, an adaptable framework that facilitates tractable and flexible control of LLM generation to reliably follow ... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | zhangTractableControlAutoregressive2023 | \cite{zhangTractableControlAutoregressive2023} | Tractable Control for Autoregressive Language Generation | http://arxiv.org/abs/2304.07438v4 | Despite the success of autoregressive large language models in text generation, it remains a major challenge to generate text that satisfies complex constraints: sampling from the conditional distribution ${\Pr}(\text{text} \| \alpha)$ is intractable for even the simplest lexical constraints $\alpha$. To overcome this c... | true | true | Zhang, Honghua and Dang, Meihua and Peng, Nanyun and Van Den Broeck, Guy | null | null | null | null | null | Tractable Control for Autoregressive Language Generation | Tractable Control for Autoregressive Language Generation | http://arxiv.org/pdf/2304.07438v4 | Despite the success of autoregressive large language models in text generation, it remains a major challenge to generate text that satisfies complex constraints: sampling from the conditional distribution ${\Pr}(\text{text} \| \alpha)$ is intractable for even the simplest lexical constraints $\alpha$. To overcome this c... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | liTreeIndexDenseRetrieval2023 | \cite{liTreeIndexDenseRetrieval2023} | Constructing Tree-based Index for Efficient and Effective Dense Retrieval | http://arxiv.org/abs/2304.11943v1 | Recent studies have shown that Dense Retrieval (DR) techniques can significantly improve the performance of first-stage retrieval in IR systems. Despite its empirical effectiveness, the application of DR is still limited. In contrast to statistic retrieval models that rely on highly efficient inverted index solutions, ... | true | true | Li, Haitao and Ai, Qingyao and Zhan, Jingtao and Mao, Jiaxin and Liu, Yiqun and Liu, Zheng and Cao, Zhao | null | null | https://doi.org/10.1145/3539618.3591651 | 10.1145/3539618.3591651 | null | Constructing Tree-based Index for Efficient and Effective Dense Retrieval | Constructing Tree-based Index for Efficient and Effective ... | https://arxiv.org/abs/2304.11943 | by H Li · 2023 · Cited by 29 — The tree-based negative sampling strategy is applied to make the tree have the maximum heap property, which supports the effectiveness of beam ...See more |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | zhuTreeRecsys2018 | \cite{zhuTreeRecsys2018} | Learning Tree-based Deep Model for Recommender Systems | http://arxiv.org/abs/1801.02294v5 | Model-based methods for recommender systems have been studied extensively in recent years. In systems with large corpus, however, the calculation cost for the learnt model to predict all user-item preferences is tremendous, which makes full corpus retrieval extremely difficult. To overcome the calculation barriers, mod... | true | true | Zhu, Han and Li, Xiang and Zhang, Pengye and Li, Guozheng and He, Jie and Li, Han and Gai, Kun | null | null | https://doi.org/10.1145/3219819.3219826 | 10.1145/3219819.3219826 | null | Learning Tree-based Deep Model for Recommender Systems | [PDF] Learning Tree-based Deep Model for Recommender Systems - arXiv | https://arxiv.org/pdf/1801.02294 | In this paper, we focus on the problem of introducing arbitrary advanced models to recommender systems with large corpus. We propose a novel tree-based method |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | zhuoOptimalTreeModels2020 | \cite{zhuoOptimalTreeModels2020} | Learning Optimal Tree Models Under Beam Search | http://arxiv.org/abs/2006.15408v1 | Retrieving relevant targets from an extremely large target set under computational limits is a common challenge for information retrieval and recommendation systems. Tree models, which formulate targets as leaves of a tree with trainable node-wise scorers, have attracted a lot of interests in tackling this challenge du... | true | true | Zhuo, Jingwei and Xu, Ziru and Dai, Wei and Zhu, Han and Li, Han and Xu, Jian and Gai, Kun | null | null | null | null | null | Learning Optimal Tree Models Under Beam Search | Learning Optimal Tree Models Under Beam Search | http://arxiv.org/pdf/2006.15408v1 | Retrieving relevant targets from an extremely large target set under computational limits is a common challenge for information retrieval and recommendation systems. Tree models, which formulate targets as leaves of a tree with trainable node-wise scorers, have attracted a lot of interests in tackling this challenge du... |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | zhuJointTreeIndexRecsys2019 | \cite{zhuJointTreeIndexRecsys2019} | Joint Optimization of Tree-based Index and Deep Model for Recommender Systems | http://arxiv.org/abs/1902.07565v2 | Large-scale industrial recommender systems are usually confronted with computational problems due to the enormous corpus size. To retrieve and recommend the most relevant items to users under response time limits, resorting to an efficient index structure is an effective and practical solution. The previous work Tree-b... | true | true | Zhu, Han and Chang, Daqing and Xu, Ziru and Zhang, Pengye and Li, Xiang and He, Jie and Li, Han and Xu, Jian and Gai, Kun | null | null | null | null | null | Joint Optimization of Tree-based Index and Deep Model for Recommender Systems | [PDF] Joint Optimization of Tree-based Index and Deep Model for ... | http://papers.neurips.cc/paper/8652-joint-optimization-of-tree-based-index-and-deep-model-for-recommender-systems.pdf | In tree-based recommendation methods, the quality of both the tree index and the user-node preference prediction model determines the recommendation accuracy. |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | zengPlanningAheadGenerative2024 | \cite{zengPlanningAheadGenerative2024} | Planning Ahead in Generative Retrieval: Guiding Autoregressive Generation through Simultaneous Decoding | http://arxiv.org/abs/2404.14600v1 | This paper introduces PAG-a novel optimization and decoding approach that guides autoregressive generation of document identifiers in generative retrieval models through simultaneous decoding. To this aim, PAG constructs a set-based and sequential identifier for each document. Motivated by the bag-of-words assumption i... | true | true | Hansi Zeng and Chen Luo and Hamed Zamani | null | null | https://doi.org/10.1145/3626772.3657746 | 10.1145/3626772.3657746 | null | Planning Ahead in Generative Retrieval: Guiding Autoregressive Generation through Simultaneous Decoding | [2404.14600] Planning Ahead in Generative Retrieval | https://arxiv.org/abs/2404.14600 | by H Zeng · 2024 · Cited by 21 — This paper introduces PAG-a novel optimization and decoding approach that guides autoregressive generation of document identifiers in generative retrieval |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | liCorpusLM2024 | \cite{liCorpusLM2024} | CorpusLM: Towards a Unified Language Model on Corpus for Knowledge-Intensive Tasks | http://arxiv.org/abs/2402.01176v2 | Large language models (LLMs) have gained significant attention in various fields but prone to hallucination, especially in knowledge-intensive (KI) tasks. To address this, retrieval-augmented generation (RAG) has emerged as a popular solution to enhance factual accuracy. However, traditional retrieval modules often rel... | true | true | Xiaoxi Li and Zhicheng Dou and Yujia Zhou and Fangchao Liu | null | null | https://doi.org/10.1145/3626772.3657778 | 10.1145/3626772.3657778 | null | CorpusLM: Towards a Unified Language Model on Corpus for Knowledge-Intensive Tasks | CorpusLM: Towards a Unified Language Model on Corpus ... | https://dl.acm.org/doi/10.1145/3626772.3657778 | In this paper, we propose CorpusLM, a unified language model that leverages external corpus to tackle various knowledge-intensive tasks. |
Constrained Auto-Regressive Decoding Constrains Generative Retrieval | 2504.09935v1 | liUnigen2024 | \cite{liUnigen2024} | UniGen: A Unified Generative Framework for Retrieval and Question Answering with Large Language Models | http://arxiv.org/abs/2312.11036v1 | Generative information retrieval, encompassing two major tasks of Generative Document Retrieval (GDR) and Grounded Answer Generation (GAR), has gained significant attention in the area of information retrieval and natural language processing. Existing methods for GDR and GAR rely on separate retrieval and reader module... | true | true | Xiaoxi Li and Yujia Zhou and Zhicheng Dou | null | null | https://doi.org/10.1609/aaai.v38i8.28714 | 10.1609/AAAI.V38I8.28714 | null | UniGen: A Unified Generative Framework for Retrieval and Question Answering with Large Language Models | UniGen: A Unified Generative Framework for Retrieval and Question ... | https://underline.io/lecture/93708-unigen-a-unified-generative-framework-for-retrieval-and-question-answering-with-large-language-models | UniGen: A Unified Generative Framework for Retrieval and Question Answering with Large Language Models |