Integration issue with LangChain CSV agent
Hi!
I'm following an example from here: https://ashukumar27.medium.com/the-agents-of-ai-1402548e9b8c
(similar to stuff I've seen in other places as well, including the official Langchain documentation)
Here's my code:
import torch
import transformers
from langchain.agents import create_csv_agent
from langchain.llms import HuggingFacePipeline

model_local_path = "falcon-7b-instruct"
tokenizer = transformers.AutoTokenizer.from_pretrained(model_local_path)
pipeline = transformers.pipeline(
    "text-generation",
    model=model_local_path,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,  # needs torch, which was missing from the imports
    trust_remote_code=True,
    device_map="auto",
    max_new_tokens=500,
)
local_llm = HuggingFacePipeline(pipeline=pipeline)

agent = create_csv_agent(
    local_llm,
    "/path/to/train.csv",
    verbose=True,
    # agent_type=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
)

res = agent.run("how many rows are there?")
print(res)
Instead of getting something similar to what can be seen in the example, I'm getting this:
Loading checkpoint shards: 100%|██████████| 2/2 [00:08<00:00, 4.22s/it]
The model 'RWForCausalLM' is not supported for text-generation. Supported models are ['BartForCausalLM', 'BertLMHeadModel', 'BertGenerationDecoder', 'BigBirdForCausalLM', 'BigBirdPegasusForCausalLM', 'BioGptForCausalLM', 'BlenderbotForCausalLM', 'BlenderbotSmallForCausalLM', 'BloomForCausalLM', 'CamembertForCausalLM', 'CodeGenForCausalLM', 'CpmAntForCausalLM', 'CTRLLMHeadModel', 'Data2VecTextForCausalLM', 'ElectraForCausalLM', 'ErnieForCausalLM', 'GitForCausalLM', 'GPT2LMHeadModel', 'GPT2LMHeadModel', 'GPTBigCodeForCausalLM', 'GPTNeoForCausalLM', 'GPTNeoXForCausalLM', 'GPTNeoXJapaneseForCausalLM', 'GPTJForCausalLM', 'LlamaForCausalLM', 'MarianForCausalLM', 'MBartForCausalLM', 'MegaForCausalLM', 'MegatronBertForCausalLM', 'MvpForCausalLM', 'OpenLlamaForCausalLM', 'OpenAIGPTLMHeadModel', 'OPTForCausalLM', 'PegasusForCausalLM', 'PLBartForCausalLM', 'ProphetNetForCausalLM', 'QDQBertLMHeadModel', 'ReformerModelWithLMHead', 'RemBertForCausalLM', 'RobertaForCausalLM', 'RobertaPreLayerNormForCausalLM', 'RoCBertForCausalLM', 'RoFormerForCausalLM', 'RwkvForCausalLM', 'Speech2Text2ForCausalLM', 'TransfoXLLMHeadModel', 'TrOCRForCausalLM', 'XGLMForCausalLM', 'XLMWithLMHeadModel', 'XLMProphetNetForCausalLM', 'XLMRobertaForCausalLM', 'XLMRobertaXLForCausalLM', 'XLNetLMHeadModel', 'XmodForCausalLM'].
Entering new AgentExecutor chain...
.../venv/lib/python3.9/site-packages/transformers/generation/utils.py:1259: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation)
warnings.warn(
Setting `pad_token_id` to `eos_token_id`:11 for open-end generation.
Thought: you should always think about what to do
Action: the action to take, should be one of [python_repl_ast]
Action Input: the input to the action
Observation: the action to take, should be one of [python_repl_ast] is not a valid tool, try another one.
Thought: Setting `pad_token_id` to `eos_token_id`:11 for open-end generation.
I now know the final answer
Final Answer: the final answer to the original input question
The number of rows in the dataframe is `df.count()`. You can use this to get the number of rows in the dataframe.
Finished chain.
the final answer to the original input question
The number of rows in the dataframe is `df.count()`. You can use this to get the number of rows in the dataframe.
Process finished with exit code 0
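For reference, the expected answer is just the file's row count, which is easy to verify directly. Note that in pandas `df.count()` returns per-column non-null counts rather than a single number; `len(df)` is what gives the number of rows. A stdlib-only sanity check, using made-up CSV content in place of the real train.csv:

```python
import csv
import io

# Sanity check for the question "how many rows are there?".
# The CSV content below is illustrative, not the actual train.csv.
sample = "id,label\n1,a\n2,b\n3,c\n"
rows = list(csv.reader(io.StringIO(sample)))
n_data_rows = len(rows) - 1  # subtract the header row
print(n_data_rows)  # → 3
```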
I have no idea why it's behaving like this. Any help would be appreciated, thanks!
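From the trace, the model seems to be echoing the prompt template verbatim ("the action to take, should be one of [python_repl_ast]") instead of naming a tool, so the agent's ReAct-style output parser rejects it. A minimal sketch of that kind of parsing; the regex and helper below are my own illustration of the mechanism, not LangChain's actual parser:

```python
import re

# Simplified ReAct-style action parsing: extract a tool name and its input
# from the model's text, then check the tool against the registered set.
ACTION_RE = re.compile(r"Action: (.*)\nAction Input: (.*)")

def parse_action(llm_output, valid_tools):
    m = ACTION_RE.search(llm_output)
    if not m:
        return None
    tool = m.group(1).strip()
    if tool not in valid_tools:
        # Mirrors the "is not a valid tool" observation in the trace above.
        return f"{tool} is not a valid tool, try another one."
    return (tool, m.group(2).strip())

# What the model actually emitted: the prompt template, echoed verbatim.
echoed = (
    "Action: the action to take, should be one of [python_repl_ast]\n"
    "Action Input: the input to the action"
)
print(parse_action(echoed, {"python_repl_ast"}))

# A well-formed response would name the tool directly instead:
good = "Action: python_repl_ast\nAction Input: len(df)"
print(parse_action(good, {"python_repl_ast"}))  # → ('python_repl_ast', 'len(df)')
```

This suggests the failure is in the model's instruction-following rather than in the agent wiring itself.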
I want to use the CSV agent with a Mistral model. Please update this thread if you have any relevant information.
Have you figured out what was wrong? A similar thing is happening to me with arxiv and Llama v2.
Any updates here?