[dev] C3 - Replace configuration dictionaries with DictConfigs.#3194
Pull request overview
This PR migrates the PyTorch pose pipeline from raw dictionaries to typed configuration classes (PoseConfig, ProjectConfig, ModelConfig) with OmegaConf DictConfig support, introducing a config versioning and migration system for backward compatibility.
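The versioning-and-migration idea can be sketched as a registry of one-step upgrade functions applied until the config reaches the current version. This is a hypothetical sketch — `CONFIG_VERSION`, `_MIGRATIONS`, `migration`, and `migrate` are illustrative names, not the PR's actual API:

```python
# Hypothetical config-migration registry; names are illustrative.
CONFIG_VERSION = 2

_MIGRATIONS = {}  # maps source version -> function upgrading by one version


def migration(from_version: int):
    """Register a function that upgrades a config dict by one version."""
    def register(fn):
        _MIGRATIONS[from_version] = fn
        return fn
    return register


@migration(1)
def _v1_to_v2(cfg: dict) -> dict:
    # Example step: rename a legacy key (invented for illustration).
    cfg = dict(cfg)
    if "model_config_path" in cfg:
        cfg["model_config"] = cfg.pop("model_config_path")
    return cfg


def migrate(cfg: dict) -> dict:
    """Upgrade a raw config dict step-by-step to CONFIG_VERSION."""
    version = cfg.get("version", 1)
    while version < CONFIG_VERSION:
        cfg = _MIGRATIONS[version](cfg)
        version += 1
        cfg["version"] = version
    return cfg
```

Stepping one version at a time keeps each migration small and lets very old configs chain through every intermediate version.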
Changes:
- Introduced config versioning system with migration support between versions
- Replaced raw dict configs with typed config classes that validate against Pydantic models
- Updated loaders and model initialization to use `from_any()` factory methods for flexible config input
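A minimal sketch of what a `from_any()`-style factory looks like, assuming it normalizes a dict, an existing instance, or a yaml path into one config type (the PR's actual implementation converts to a validated OmegaConf DictConfig; the class body here is illustrative):

```python
# Sketch of a from_any()-style factory; assumed shape, not the PR's code.
from __future__ import annotations

from pathlib import Path

import yaml


class ModelConfig:
    def __init__(self, cfg: dict) -> None:
        self.cfg = cfg

    @classmethod
    def from_any(cls, src) -> "ModelConfig":
        """Accept a ModelConfig, dict, Path, or str and return a ModelConfig."""
        if isinstance(src, cls):
            return src  # already normalized; pass through unchanged
        if isinstance(src, (str, Path)):
            with open(src) as f:
                return cls(yaml.safe_load(f))
        if isinstance(src, dict):
            return cls(src)
        raise TypeError(f"Unsupported config source: {type(src)!r}")
```

Centralizing the normalization in one classmethod lets every downstream function accept "anything config-like" without repeating isinstance checks.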
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| deeplabcut/pose_estimation_pytorch/models/model.py | Updated PoseModel to accept ModelConfig/DictConfig/dict/Path/str and convert to validated DictConfig |
| deeplabcut/pose_estimation_pytorch/data/dlcloader.py | Refactored DLCLoader to use ProjectConfig with typed validation and OmegaConf for nested access |
| deeplabcut/pose_estimation_pytorch/data/cocoloader.py | Added model_config parameter to replace deprecated model_config_path |
| deeplabcut/pose_estimation_pytorch/data/base.py | Introduced legacy argument handling for model_config_path with deprecation warning |
| deeplabcut/pose_estimation_pytorch/config/make_pose_config.py | Refactored config creation to use typed configs and split defaults loading into separate function |
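The legacy-argument handling described for `base.py` typically follows a standard deprecation pattern. A sketch of that pattern, assuming a keyword-only shim (the function name and body are illustrative, not the PR's exact code):

```python
# Sketch of deprecating model_config_path in favor of model_config.
import warnings


def resolve_model_config(model_config=None, model_config_path=None):
    """Return the config source, warning if the legacy argument is used."""
    if model_config_path is not None:
        warnings.warn(
            "`model_config_path` is deprecated; pass `model_config` instead.",
            DeprecationWarning,
            stacklevel=2,  # point the warning at the caller, not this shim
        )
        if model_config is None:
            model_config = model_config_path
    return model_config
```

`stacklevel=2` makes the warning point at the calling code, which is what users need to locate and update the deprecated call site.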
Overall very good, but I would change/check a few things in this one.
Also, to have this written down somewhere: I'd strongly favor checking whether we truly need OmegaConf and its features. I feel we do, even if we do not take full advantage of them yet, but keeping dependencies lean is also a very important consideration imo.
You probably have a better outlook than I do, but another path I see is just making pydantic dataclasses have dict-like access and using that in downstream code.
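The dict-like-access alternative mentioned above can be done with a small mixin. Shown here with stdlib dataclasses for self-containment (pydantic dataclasses would work the same way); everything below is illustrative:

```python
# Illustrative mixin giving a typed config dict-style read access.
from dataclasses import dataclass, fields


class DictAccessMixin:
    def __getitem__(self, key):
        return getattr(self, key)

    def __contains__(self, key):
        return key in {f.name for f in fields(self)}

    def get(self, key, default=None):
        return getattr(self, key, default)


@dataclass
class PoseConfig(DictAccessMixin):
    net_type: str = "resnet_50"
    batch_size: int = 8
```

With this, downstream code written against dicts (`cfg["net_type"]`, `cfg.get(...)`) keeps working without pulling in OmegaConf, at the cost of losing OmegaConf features like interpolation and struct-mode merging.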
The head_cfg (part of PoseConfig) was used for dumping modules, which typed configs do not support, so that container is now a plain dict rather than a config.
@C-Achard, some relevant additional changes:
- Pathlib and Enum serialization (see 538ca3a and 347ff18)
- Empty yaml field handling (see b890d6c), e.g.:

```yaml
# myconfig.yaml
Task: openfield
scorer: Pranav
multianimalproject:
identity:
```
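For context on what the empty-field handling addresses: in yaml, a key with no value (like `identity:` above) loads as `None`, so config code has to treat such fields consistently with missing ones. A quick illustration with PyYAML:

```python
# Empty yaml fields parse as None.
import yaml

cfg = yaml.safe_load(
    """
Task: openfield
scorer: Pranav
multianimalproject:
identity:
"""
)
# Both empty keys are present in the mapping but carry None values.
```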
This PR is part of the WIP for migrating from dictionary configs to typed & validated configurations, and follows PR #3191 (see issue #3193 for an overview).
Summary:
Refactors the PyTorch pose pipeline to use typed config classes (PoseConfig, ProjectConfig, ModelConfig) and OmegaConf DictConfig instead of raw dicts.
Main changes:
`make_pose_config` now creates an OmegaConf DictConfig validated against the PoseConfig dataclass