household_gpt_base¶
- class pyrecodes.household.household_gpt_base.HouseholdGPTBase¶
Bases: object

Shared GPT machinery for all household agent types.
Not a Household itself: subclasses that participate in the regional simulation must also inherit from Household, while survey-only agents need only this base.
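The dual-inheritance rule above can be sketched with stand-in classes. A minimal illustration, assuming `Household` comes from elsewhere in pyrecodes; the class names below (other than `Household` and `HouseholdGPTBase`) are hypothetical, not part of the package:

```python
class Household:            # stand-in for pyrecodes' Household base class
    pass

class HouseholdGPTBase:     # stand-in for the base documented here
    PROMPTS_FILE = None

class SimulatedHousehold(Household, HouseholdGPTBase):
    """Participates in the regional simulation, so it needs both bases."""

class SurveyOnlyAgent(HouseholdGPTBase):
    """Survey-only agent: the GPT machinery alone is enough."""
```

An agent that never enters the regional simulation skips the `Household` base entirely, which is why `HouseholdGPTBase` deliberately inherits only from `object`.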
- PROMPTS_FILE = None¶
- async async_inform_gpt(inform_gpt_method: str) → None¶
- async async_prompt_llm(prompt: dict, print_answer: bool = False, add_to_past_experience: bool = False) → str¶
- async async_read_literature(file_name: str = './pyrecodes/household/literature_summaries.json') → None¶
- async async_read_ruleset() → None¶
- create_llm(api_key_filename: str, temperature: float, llm_model: str, summarize_experience: bool = True) → None¶
- format_household_options_string(household_options: list) → str¶
- get_decision(answer: str)¶
- inform_gpt(inform_gpt_method: str) → None¶
- print_chat_history(text_width: int = 80) → None¶
- prompt_llm(prompt: dict, print_answer: bool = False, add_to_past_experience: bool = False) → str¶
- read_literature(file_name: str = './pyrecodes/household/literature_summaries.json') → None¶
- read_ruleset() → None¶
- static split_text_into_batches(text: str, max_words: int = 500)¶
- string_to_dict(string: str) → dict¶
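The static `split_text_into_batches` helper suggests how long inputs such as literature summaries are chunked to fit the model's context window. A minimal sketch, assuming the method simply groups whitespace-separated words into batches of at most `max_words` (the package's actual implementation may differ):

```python
def split_text_into_batches(text: str, max_words: int = 500) -> list[str]:
    """Split text into batches of at most max_words words each."""
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]
```

Under this assumption, a 1200-word summary yields three batches (500, 500, and 200 words), each small enough to prompt the model with separately.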
- class pyrecodes.household.household_gpt_base.LLM(api_key_filename: str, temperature: float = 1.0, llm_model: str = 'GPT', summarize_experience: bool = True)¶
Bases: object

Wraps OpenAI GPT, maintaining chat history and prompt management.
- add_to_chat_history(prompt: dict) → None¶
- add_to_past_experience(answer: str) → None¶
- add_to_relevant_prompts(prompt: dict) → None¶
- get_past_experience_string() → str¶
- prompt(prompt: dict, print_answer: bool = False, add_to_past_experience: bool = False) → str¶
- async prompt_async(prompt: dict, print_answer: bool = False, add_to_past_experience: bool = False) → str¶
- query_llm(messages: list) → str¶
- set_llm_model(llm_model: str, api_key_filename: str) → None¶
- update_past_experience(description_prompt: dict, latest_answer: str) → None¶
- async update_past_experience_async(description_prompt: dict, latest_answer: str) → None¶
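The wrapper's bookkeeping methods can be illustrated without any network access. A minimal sketch, assuming the chat history holds OpenAI-style message dicts (`{"role": ..., "content": ...}`) and past experience is a list of earlier answers joined into one string; the method names mirror the API above, but the bodies and the class name are assumptions, not the package's implementation:

```python
class LLMHistorySketch:
    """Illustrates only the chat-history and past-experience bookkeeping."""

    def __init__(self) -> None:
        self.chat_history: list[dict] = []    # OpenAI-style message dicts
        self.past_experience: list[str] = []  # earlier answers, kept as text

    def add_to_chat_history(self, prompt: dict) -> None:
        self.chat_history.append(prompt)

    def add_to_past_experience(self, answer: str) -> None:
        self.past_experience.append(answer)

    def get_past_experience_string(self) -> str:
        # Concatenated past answers, ready to prepend to a new prompt.
        return "\n".join(self.past_experience)
```

Keeping past answers as plain text lets a new prompt carry the agent's earlier decisions as context, which is presumably what `summarize_experience` trims when the history grows long.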