sima_utils.transformer.default_llm_config
=========================================

.. py:module:: sima_utils.transformer.default_llm_config


Attributes
----------

.. autoapisummary::

   sima_utils.transformer.default_llm_config.LLM_TOKENIZER_TYPE
   sima_utils.transformer.default_llm_config.LLM_VOCAB_SIZE
   sima_utils.transformer.default_llm_config.LLM_EMBED_DIM
   sima_utils.transformer.default_llm_config.LLM_INTERMEDIATE_SIZE
   sima_utils.transformer.default_llm_config.LLM_NUM_HIDDEN_LAYERS
   sima_utils.transformer.default_llm_config.LLM_NUM_ATTENTION_HEADS
   sima_utils.transformer.default_llm_config.LLM_HEAD_DIMENSION
   sima_utils.transformer.default_llm_config.LLM_NUM_KEY_VALUE_HEADS
   sima_utils.transformer.default_llm_config.LLM_CONTEXT_LENGTH
   sima_utils.transformer.default_llm_config.LLM_ROPE
   sima_utils.transformer.default_llm_config.LLM_SLIDING_WINDOW_ATTENTION
   sima_utils.transformer.default_llm_config.LLM_HIDDEN_ACT
   sima_utils.transformer.default_llm_config.LLM_LAYER_NORM


Functions
---------

.. autoapisummary::

   sima_utils.transformer.default_llm_config.get_llm_parameter


Module Contents
---------------

.. py:data:: LLM_TOKENIZER_TYPE

.. py:data:: LLM_VOCAB_SIZE

.. py:data:: LLM_EMBED_DIM

.. py:data:: LLM_INTERMEDIATE_SIZE

.. py:data:: LLM_NUM_HIDDEN_LAYERS

.. py:data:: LLM_NUM_ATTENTION_HEADS

.. py:data:: LLM_HEAD_DIMENSION

.. py:data:: LLM_NUM_KEY_VALUE_HEADS

.. py:data:: LLM_CONTEXT_LENGTH

.. py:data:: LLM_ROPE

.. py:data:: LLM_SLIDING_WINDOW_ATTENTION

.. py:data:: LLM_HIDDEN_ACT

.. py:data:: LLM_LAYER_NORM

.. py:function:: get_llm_parameter(cfg: dict, key: str, default_if_missing: any) -> any

   Look up a parameter by name in the configuration dictionary.

   :param cfg: The configuration dictionary.
   :type cfg: dict
   :param key: The name of the parameter.
   :type key: str
   :param default_if_missing: The value to return if the parameter is missing.
   :type default_if_missing: any
   :returns: The value of the named parameter, or ``default_if_missing`` if the
             parameter is not present in ``cfg``.
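
   A minimal usage sketch follows. The configuration dictionary and its keys
   (``num_attention_heads``, ``head_dim``) are illustrative assumptions, not
   values defined by this module; only ``get_llm_parameter`` itself comes from
   the documented API.

   .. code-block:: python

      # Hypothetical example: the cfg dict and its keys are illustrative,
      # not part of sima_utils.transformer.default_llm_config.
      from sima_utils.transformer.default_llm_config import get_llm_parameter

      cfg = {"num_attention_heads": 32}

      # Key present in cfg: the stored value is returned (32 here).
      n_heads = get_llm_parameter(cfg, "num_attention_heads", 16)

      # Key missing from cfg: the supplied default is returned (128 here).
      head_dim = get_llm_parameter(cfg, "head_dim", 128)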