Submitted by RAFisherman t3_114d166 in MachineLearning
pyfreak182 t1_j8vpx4e wrote
In case you're not familiar, there are also Time2Vec embeddings for Transformers. It would be interesting to see how that architecture compares as well.
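For reference, the core of Time2Vec is one learned linear term plus k sinusoidal terms with learned frequencies and phases. A rough NumPy sketch (the function name and shapes are just illustrative, not from the paper's code):

```python
import numpy as np

def time2vec(tau, omega, phi):
    """Minimal Time2Vec sketch (Kazemi et al., 2019).

    tau   : scalar or array of time values
    omega : learnable frequencies, shape (k + 1,)
    phi   : learnable phase shifts, shape (k + 1,)

    Index 0 is a linear (non-periodic) term; indices 1..k are sinusoidal.
    """
    tau = np.atleast_1d(tau)[:, None]                 # shape (n, 1)
    linear = omega[0] * tau + phi[0]                  # non-periodic component
    periodic = np.sin(omega[1:] * tau + phi[1:])      # periodic components
    return np.concatenate([linear, periodic], axis=-1)  # shape (n, k + 1)

# Example: embed timestamps 0..4 into a 1 + 7 = 8 dimensional vector
rng = np.random.default_rng(0)
emb = time2vec(np.arange(5), rng.normal(size=8), rng.normal(size=8))
print(emb.shape)  # (5, 8)
```

In a real model omega and phi would be trained parameters rather than random draws; the point is just that the "clock rates" are learned instead of fixed like sinusoidal position embeddings.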
dancingnightly t1_j8y81v9 wrote
Do you know of any similar encoding where you vectorise relative time, as multiple proportions of completeness, if that makes sense?
Say, completeness within a paragraph, within a chapter, within a book? (Besides sinusoidal embeddings, which push up the number of examples you need.)
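Something like this sketch is what I have in mind, purely illustrative and assuming you already know the token offsets of each paragraph/chapter boundary:

```python
import numpy as np

def relative_position_features(token_idx, para_bounds, chap_bounds, doc_len):
    """Illustrative only: fraction-of-completeness at several granularities.

    para_bounds / chap_bounds are (start, end) token offsets containing token_idx.
    Returns three numbers in [0, 1]: position within paragraph, chapter, book.
    """
    para_frac = (token_idx - para_bounds[0]) / max(para_bounds[1] - para_bounds[0], 1)
    chap_frac = (token_idx - chap_bounds[0]) / max(chap_bounds[1] - chap_bounds[0], 1)
    doc_frac = token_idx / max(doc_len, 1)
    return np.array([para_frac, chap_frac, doc_frac])

# e.g. token 1500, inside a paragraph spanning tokens 1400-1600,
# a chapter spanning 1000-3000, in a 90_000-token book
print(relative_position_features(1500, (1400, 1600), (1000, 3000), 90_000))
# -> [0.5, 0.25, 0.0166...]
```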
RAFisherman OP t1_j8x2wdw wrote
Didn’t think of that. Will take a look!
I do care about interpretability to some extent, which is why embeddings sound complex. But I'm now curious for sure.
RAFisherman OP t1_j8x39qj wrote
After skimming the paper, it seems like Time2Vec is kind of a "seasonality" factor (similar to what Prophet outputs). Is that true?
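The analogy I'm drawing, roughly: Prophet builds seasonality from Fourier terms with periods you specify up front, whereas Time2Vec's periodic components have learned frequencies and phases. A quick sketch to make that concrete (the helper name is mine, not Prophet's API):

```python
import numpy as np

def prophet_style_seasonality(t, period, k):
    """Prophet-style seasonal features: Fourier terms with a FIXED period.

    t : array of times; period : e.g. 365.25 for yearly; k : number of harmonics.
    Prophet then fits linear coefficients on columns like these.
    """
    t = np.atleast_1d(t)[:, None]
    n = np.arange(1, k + 1)
    return np.concatenate([np.sin(2 * np.pi * n * t / period),
                           np.cos(2 * np.pi * n * t / period)], axis=-1)

# Time2Vec's periodic part is sin(omega_i * t + phi_i) with omega_i, phi_i
# learned, so the model picks its own periods instead of being told them.
```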