Comments
BITE_AU_CHOCOLAT t1_ithud6t wrote
Eh... I'm currently training a model with 700M parameters (most of which are in the embeddings used as input, not so much in the hidden layers themselves), and PyTorch pretty much required at least 50GB per GPU, while TensorFlow was happy to train on 3090s, which were way, wayyyy cheaper to rent than A6000s, even though PyTorch managed better GPU utilization. So I think I'm just gonna stick with TF/Keras and TFLite for now.
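(For scale, a back-of-the-envelope sketch of how an embedding table can dominate a parameter count; the vocabulary size and layer widths below are made-up illustrations, not the commenter's actual model.)

```python
# Purely hypothetical numbers, just to illustrate the ratio.
vocab_size, embed_dim = 2_000_000, 320
embedding_params = vocab_size * embed_dim        # 640,000,000 in the table
hidden_params = 320 * 1024 + 1024 * 1024 + 1024  # ~1.4M for two dense layers
total = embedding_params + hidden_params
print(f"{embedding_params / total:.1%} of parameters live in the embeddings")
```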
learn-deeply t1_itikofd wrote
PyTorch doesn't inherently use more or less memory than TensorFlow; there's a bug in your code. If it's easier to switch frameworks than to debug, more power to you.
BITE_AU_CHOCOLAT t1_itiofjz wrote
Well, I haven't "switched", given that I've been using TensorFlow since the start of the project. I was just curious to see if PyTorch could let me squeeze out more juice, and after spending a weekend trying to learn PyTorch's assembly-like syntax, it turns out that yes, but actually no. So yeah, I'm perfectly content with using model.fit and calling it a day for the time being.
Oh, and I also forgot: PyTorch won't train with a distributed strategy in a Jupyter environment. KEK.
jaschau t1_itguay4 wrote
Am I the only one for whom the announcement feels a bit out of touch with reality?
sharky6000 t1_iti87n8 wrote
I am not a fan of TF by any means, but:
> It’s the 3rd most-starred software repository on GitHub (right behind Vue and React) and the most-downloaded machine learning package on PyPI
Can't really make that stuff up. There are quite a lot of TF users out there.
jaschau t1_itk0o02 wrote
I completely agree that the numbers are correct. I just felt like they might not tell the whole story. For example, looking at the section where they say, I paraphrase, "x preprints are uploaded every day that mention TF", I don't doubt the numbers, but the way they tell it certainly evokes a different image compared to saying "the share of preprints relying on TF has been steadily declining over the past few years".
Regarding the many TF users out there, I would be curious what the main benefit is for them. Is it TPU support, TF serving, TF lite, something else?
LetterRip t1_itiqjqp wrote
Everyone downloads PyTorch directly from the PyTorch site, so the PyPI download comparison is somewhat misleading.
drinkingsomuchcoffee t1_ithb1xs wrote
Unsurprising. Google's been out of touch with reality for a while now. That's what happens when you have a near monopoly (besides Apple). Despite the claims of how elite they are, the APIs they produce are pretty garbage, except for a few lucky hits like JAX.
VirtualHat t1_itkajqr wrote
I use Pytorch every day and haven't gone back to TF for years. That being said, there are lots of old projects still on TF, and indeed on the older 1.x version before they fixed most of the stuff.
I'm glad they're working on XLA and JAX though.
johnnymo1 t1_ith4dxp wrote
I'm happy to learn about this KerasCV package. I've been using the TensorFlow Object Detection API for work lately and I hate it.
puppet_pals t1_ithf7m4 wrote
Luke here from KerasCV - the object detection API is still under active development. I recently got the RetinaNet to score a mAP of 0.49 on PascalVOC in a private notebook; that should be possible with just the standalone package in the 0.4.0 release. I'd say give it a month.
The API surface won't change a lot, but the results will get a lot better in the next release.
johnnymo1 t1_ithghw0 wrote
I think there may be a name collision going on here haha. You mean the object detection API within KerasCV?
puppet_pals t1_ithgyr7 wrote
Oops - yes.
johnnymo1 t1_ithh4gp wrote
Thought so. Looking forward to using it on future projects!
IndieAIResearcher t1_ith113b wrote
JAX integration with TF inference, Lite, or JS is my favorite!!
Lajamerr_Mittesdine t1_itgeiy5 wrote
Can someone who has the perspective of using both TensorFlow and PyTorch give their opinion on why you would or wouldn't use each?
And how this announcement changes things for you.
IndependentSavings60 t1_itgqugk wrote
I used TF back when deep learning was taking off, around 2016. Personally, it became really hard to use once TF allowed contributed modules to be merged into the codebase: the API interface was inconsistent, and a wall of deprecation warnings appeared every time I ran code or updated the TF version. I think most people back then used TF or PyTorch to try out research ideas, and TF just didn't work as well in that role as PyTorch did. More and more researchers switched to PyTorch for ease of use, and so did their new models. I doubt anyone besides Google still publishes models on TF.
anomaly_in_testset t1_itgwh3d wrote
I am primarily a TF 2.x user and have been using JAX for most of my thesis. For me, this is a huge update, since TF doesn't support item assignment, NumPy-like behavior, etc. And XLA is a huge update, as the speed difference between JAX and TF is significant. So overall, I am very excited.
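(For anyone unfamiliar, a minimal sketch of the difference being described, assuming recent TF 2.x and JAX; the function and variable names are mine.)

```python
import tensorflow as tf
import jax
import jax.numpy as jnp

# TF tensors are immutable and reject x[0] = 1.0; you need a scatter op:
x_tf = tf.tensor_scatter_nd_update(tf.zeros([5]), indices=[[0]], updates=[1.0])

# JAX arrays are immutable too, but offer NumPy-style indexed updates
# through the functional .at[] syntax:
x_jax = jnp.zeros(5).at[0].set(1.0)

# XLA compilation in JAX is a one-line decorator:
@jax.jit
def shifted_square(a):
    return (a + 1.0) ** 2

print(shifted_square(x_jax))
```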
Mefaso t1_itkp4he wrote
>And XLA is a huge update as the speed difference in Jax and TF is significant
Which one is significantly faster? I'm guessing JAX?
anomaly_in_testset t1_itlgf25 wrote
Yes, JAX is faster.
fasttosmile t1_itiivv7 wrote
Interesting that they've decided to keep investing in it. I suppose, with the amount of existing code they must have, it was hard not to.
learn-deeply t1_itikwmy wrote
They have to. If they created TF 3.0 without backward compatibility or told people to switch to JAX, everyone who used TensorFlow would just move to PyTorch.
levilain35 t1_itrlfj0 wrote
I have tried TensorFlow 1, TensorFlow 2, and PyTorch. I love TensorFlow because it is very efficient at scaling big models across several GPUs or workers. On top of that, TFRecords are very cool: the input pipeline never drops back into Python during training. That is a very important part of TensorFlow, even if PyTorch is also high quality. For my applications I need to train very big models easily and deploy them, so I am happy to hear that this is a main goal of TensorFlow.
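(For context, a minimal sketch of the kind of pipeline described above: TFRecords parsed entirely inside the TF runtime, plus MirroredStrategy for multi-GPU scaling. The feature names, file pattern, and model choice are hypothetical.)

```python
import tensorflow as tf

feature_spec = {
    "image": tf.io.FixedLenFeature([], tf.string),
    "label": tf.io.FixedLenFeature([], tf.int64),
}

def parse(record):
    # Runs as TF graph ops in the C++ runtime, not in Python.
    example = tf.io.parse_single_example(record, feature_spec)
    image = tf.io.decode_jpeg(example["image"], channels=3)
    image = tf.image.resize(image, [224, 224]) / 255.0
    return image, example["label"]

dataset = (
    tf.data.TFRecordDataset(tf.io.gfile.glob("data/train-*.tfrecord"))
    .map(parse, num_parallel_calls=tf.data.AUTOTUNE)
    .shuffle(10_000)
    .batch(256)
    .prefetch(tf.data.AUTOTUNE)
)

strategy = tf.distribute.MirroredStrategy()  # one replica per local GPU
with strategy.scope():
    model = tf.keras.applications.ResNet50(weights=None, classes=10)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

model.fit(dataset, epochs=10)
```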
Purple_noise_84 t1_itgynxx wrote
A few more years and it will be as good as PyTorch is today :)