This repository has been archived by the owner on Dec 11, 2022. It is now read-only.

Custom Head instantiation is not supported #261

Open
redknightlois opened this issue Mar 22, 2019 · 2 comments

Comments

@redknightlois (Contributor) commented Mar 22, 2019

Using Version 0.11.1

I wanted to modify a particular head in order to adjust some calculations to the agent's requirements, and found that you cannot instantiate the new head if it doesn't live in the rl_coach namespace.

For example, I defined a custom CIL head like:

class CilHeadParameters(HeadParameters):
    def __init__(self, activation_function: str ='relu', name: str='q_head_params',
                 num_output_head_copies: int = 1, rescale_gradient_from_head_by_factor: float = 1.0,
                 loss_weight: float = 1.0, dense_layer=None, scheme=None):
        super().__init__(parameterized_class_name="cil:CilHead", activation_function=activation_function, name=name,
                         dense_layer=dense_layer, num_output_head_copies=num_output_head_copies,
                         rescale_gradient_from_head_by_factor=rescale_gradient_from_head_by_factor,
                         loss_weight=loss_weight)


class CilHead(Head):
    def __init__(self, agent_parameters: AgentParameters, spaces: SpacesDefinition, network_name: str,
                 head_idx: int = 0, loss_weight: float = 1., is_local: bool = True, activation_function: str='relu',
                 dense_layer=Dense, scheme=[Dense(256), Dense(256)]):
        super().__init__(agent_parameters, spaces, network_name, head_idx, loss_weight, is_local, activation_function,
                         dense_layer=dense_layer)
        self.name = 'regression_head'
        ...

    def _build_module(self, input_layer):
        ...

    def __str__(self):
        result = []
        for layer in self.layers:
            result.append(str(layer))
        return '\n'.join(result)

As you can see, this head lives in the cil module. However, when you execute, you get:

  File "C:\Anaconda3\lib\site-packages\rl_coach\architectures\tensorflow_components\architecture.py", line 105, in __init__
    self.get_model()
  File "C:\Anaconda3\lib\site-packages\rl_coach\architectures\tensorflow_components\general_network.py", line 305, in get_model
    head_idx*head_params.num_output_head_copies + head_copy_idx)
  File "C:\Anaconda3\lib\site-packages\rl_coach\architectures\tensorflow_components\general_network.py", line 223, in get_output_head
    'head_idx': head_idx, 'is_local': self.network_is_local})
  File "C:\Anaconda3\lib\site-packages\rl_coach\utils.py", line 400, in dynamic_import_and_instantiate_module_from_params
    module = short_dynamic_import(path)
  File "C:\Anaconda3\lib\site-packages\rl_coach\utils.py", line 356, in short_dynamic_import
    ignore_module_case=ignore_module_case)
TypeError: dynamic_import() got multiple values for argument 'ignore_module_case'

What happens is that the loading method is not able to figure out that we have explicitly declared the module, and therefore concatenates it into the string before calling short_dynamic_import.
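The failure mode can be illustrated with a stripped-down sketch of what a "module:Class" importer has to do. This is a simplified stand-in, not Coach's actual utils code; the function name load_from_path and the default_module value are hypothetical. The point is that a path containing ":" already names its module explicitly, so prepending the framework's default namespace on top of it produces a bogus import path:

```python
import importlib

def load_from_path(path: str,
                   default_module: str = "rl_coach.architectures.tensorflow_components.heads"):
    """Resolve a class from either 'module:Class' or a bare 'Class' name.

    A loader that unconditionally prepends its default namespace would
    instead build something like
    'rl_coach.architectures.tensorflow_components.heads.cil:CilHead'
    and fail, which is the behavior reported in this issue.
    """
    if ":" in path:
        # The caller declared the module explicitly -- respect it.
        module_name, class_name = path.split(":")
    else:
        # Bare class name: fall back to the framework's own namespace.
        module_name, class_name = default_module, path
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# Example, using a standard-library module as a stand-in for a custom head module:
cls = load_from_path("collections:OrderedDict")
print(cls.__name__)  # OrderedDict
```

With this split-first logic, "cil:CilHead" would resolve against the user's own module instead of being glued onto the framework's head namespace.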

@gal-leibovich (Contributor) commented:

Please provide more detail. What did you run that triggered this exception?

@redknightlois (Contributor, Author) commented Mar 25, 2019

The repro is pretty simple.

  • Install Coach as a library (pip install rl_coach)
  • Create a new project that does some simple stuff.
  • Copy the code of the head you are using into the __main__ module.
  • Use that head instead of the default one.
  • Run and see it fail in this way.
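Until the loader is fixed, one generic workaround (a sketch only, not an officially supported mechanism) is to pre-register your module in sys.modules under exactly the name the loader will try to import, so that whatever path it constructs still resolves. The module name "cil" and the CilHead stand-in below are illustrative:

```python
import importlib
import sys
import types

# Build a module object by hand and register it under the name the
# dynamic importer is expected to look up. In a real project this would
# be the module that actually defines your custom head.
cil = types.ModuleType("cil")

class CilHead:  # stand-in for the real head class
    pass

cil.CilHead = CilHead
sys.modules["cil"] = cil

# importlib consults sys.modules first, so the lookup now succeeds.
mod = importlib.import_module("cil")
print(mod.CilHead is CilHead)  # True
```

This only helps if the concatenated path the loader builds happens to match the registered name, so it is a stopgap at best; the proper fix is for the loader to honor an explicitly declared module.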