Submitted by RAFisherman t3_114d166 in MachineLearning
2dayiownu t1_j8wpinb wrote
Temporal Fusion Transformer
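If it helps, a minimal sketch of fitting a TFT with the Darts implementation (TFTModel) might look like the following; the dataset and hyperparameters are just placeholders, and add_relative_index=True is set so the example runs without explicit future covariates.

```python
# Minimal TFT sketch using Darts' TFTModel (assumes `pip install "u8darts[torch]"`).
# AirPassengers and the hyperparameters below are placeholders, not a recommendation.
from darts.datasets import AirPassengersDataset
from darts.models import TFTModel

series = AirPassengersDataset().load()
train, val = series.split_after(0.8)

model = TFTModel(
    input_chunk_length=24,    # lookback window fed to the encoder
    output_chunk_length=12,   # forecast horizon produced per decoder step
    add_relative_index=True,  # lets TFT train without explicit future covariates
    n_epochs=50,
)
model.fit(train)
forecast = model.predict(n=len(val))
```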
emotionalfool123 t1_j8wvvda wrote
This lib has a lot of implementations, including the one you mentioned.
dj_ski_mask t1_j8x2m11 wrote
I am knee-deep in this library at work right now.
Pros: they implement tons of algos and regularly update with the ‘latest and greatest,’ like NHITS. It can also scale with GPUs/TPUs for the algos that use the Torch backend. Depending on the algo you can add covariates, and the “global” models for multivariate time series are impressive in their performance.
Cons: my god, it’s a finicky library that takes considerable time to pick up. Weird syntax/restrictions for scoring and evaluating. Differentiating between “past” and “future” covariates is not as cut and dried as the documentation makes it seem. Also, limited tutorials and examples.
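For what it’s worth, the split roughly means: past covariates are only observed up to the present (you won’t have their future values at prediction time), while future covariates are known over the forecast horizon (calendar features, planned promotions). A rough sketch of how that looks, assuming the library in question is Darts and using made-up column names:

```python
# Rough sketch of past vs. future covariates, assuming the library is Darts.
# "sales.csv", the "web_traffic" column, and the calendar feature are illustrative placeholders.
import pandas as pd
from darts import TimeSeries
from darts.models import TFTModel
from darts.utils.timeseries_generation import datetime_attribute_timeseries

df = pd.read_csv("sales.csv")
target = TimeSeries.from_dataframe(df, time_col="date", value_cols="sales")

# Past covariates: observed alongside the target, unknown over the forecast horizon.
past_cov = TimeSeries.from_dataframe(df, time_col="date", value_cols="web_traffic")

# Future covariates: known in advance (here, one-hot month dummies extended 12 steps ahead).
future_cov = datetime_attribute_timeseries(target, attribute="month", one_hot=True, add_length=12)

model = TFTModel(input_chunk_length=24, output_chunk_length=12, n_epochs=50)
model.fit(target, past_covariates=past_cov, future_covariates=future_cov)
forecast = model.predict(n=12, past_covariates=past_cov, future_covariates=future_cov)
```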
All in all, I like it and am making a speed run at learning this library for my time series needs.
To OP: I would suggest NHITS, but also note that tree-based methods STILL tend to win with the data I work with.
emotionalfool123 t1_j8x49h8 wrote
Then it seems this is equivalent to the confusion that R time series libraries cause.
clisztian t1_j8z3t1r wrote
I guarantee you a state space model will beat out any fancy-named transformer for most “forecastable” problems. Even MDFA (signal extraction + exp integration for forecasting) will beat out these big ML models.
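For reference, a basic structural state space model takes only a few lines with statsmodels; the local linear trend plus monthly seasonal spec below is just one illustrative choice, and the CSV is a placeholder.

```python
# Basic structural (state space) model via statsmodels' UnobservedComponents.
# "series.csv" and the local-linear-trend + monthly-seasonal spec are illustrative choices.
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

y = pd.read_csv("series.csv", index_col="date", parse_dates=True)["y"]

model = UnobservedComponents(y, level="local linear trend", seasonal=12)
results = model.fit(disp=False)
forecast = results.forecast(steps=12)  # 12-step-ahead forecast
print(results.summary())
```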
dj_ski_mask t1_j9345mc wrote
I should mention that, with some tuning, I have been able to get NHITS to outperform Naive Seasonal, CatBoost with lags, and ES models, so it’s not terrible.
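For anyone curious, a toy version of that kind of head-to-head in Darts might look like the sketch below; AirPassengers stands in for real data and the hyperparameters are untuned placeholders, so don’t read anything into the numbers.

```python
# Toy comparison of NHiTS against naive-seasonal and exponential-smoothing baselines in Darts.
# AirPassengers and the hyperparameters are placeholders; this is not a rigorous benchmark.
from darts.datasets import AirPassengersDataset
from darts.metrics import mape
from darts.models import ExponentialSmoothing, NaiveSeasonal, NHiTSModel

series = AirPassengersDataset().load()
train, val = series.split_after(0.8)

models = {
    "NaiveSeasonal": NaiveSeasonal(K=12),
    "ExponentialSmoothing": ExponentialSmoothing(),
    "NHiTS": NHiTSModel(input_chunk_length=24, output_chunk_length=12, n_epochs=50),
}

for name, model in models.items():
    model.fit(train)
    pred = model.predict(n=len(val))
    print(f"{name}: MAPE = {mape(val, pred):.2f}")
```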