Submitted by bigbossStrife t3_z2a0xg in MachineLearning

For an ML project I have at work, I've been considering whether I should build my training and deployment pipeline in plain PyTorch or use something like PyTorch Lightning instead. I like how easy Lightning is to use and all the little things it automates on its own, but I also like knowing what happens in the background and being able to do specific things when needed. If I end up spending more time reading a framework's documentation to figure out how to do one little thing than I would have spent just making it work myself, that feels like a waste of time.

So that's why I initially went with the PyTorch-only implementation. The thing is, as the project moved forward I kept implementing more and more things, and I felt like I was redoing a lot of what frameworks already offer, like automatic batch size finding, early stopping, etc.
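To give a concrete example of the kind of thing I kept rewriting, a hand-rolled early-stopping helper ends up looking roughly like this (just a sketch, the names are illustrative):

```python
import math


class EarlyStopping:
    """Stop training when the monitored validation loss hasn't improved
    by at least `min_delta` for `patience` consecutive epochs."""

    def __init__(self, patience: int = 5, min_delta: float = 0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = math.inf
        self.bad_epochs = 0

    def step(self, val_loss: float) -> bool:
        # Returns True when training should stop.
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience
```

Plus an `if stopper.step(val_loss): break` in the epoch loop, checkpointing the best weights, and so on. Individually small, but it adds up.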

I was wondering what's the workflow of other people here and was curious to hear some opinions on this.

32

Comments


Witty-Elk2052 t1_ixfeshp wrote

huggingface accelerate is a good middleground
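For a rough idea of what it buys you, a training step with Accelerate looks something like this (a sketch, assuming `model`, `optimizer` and `dataloader` are already defined and the model returns its own loss):

```python
from accelerate import Accelerator

accelerator = Accelerator()  # picks up device, mixed precision and DDP config
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

model.train()
for batch in dataloader:
    optimizer.zero_grad()
    loss = model(**batch).loss    # assumes an HF-style model that returns a loss
    accelerator.backward(loss)    # replaces loss.backward()
    optimizer.step()
```

The training loop stays yours; only device placement, distributed setup and the backward call are wrapped.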

20

suflaj t1_ixffok6 wrote

I was thinking the same as you for maybe 3-4 years, and over this last year I've been regretting it. I could've saved so much time if I had embraced these frameworks earlier. I'll learn how to use them eventually, probably in the next few months.

9

linverlan t1_ixg4u3s wrote

This feels backwards. Best approach is to use the most off-the-shelf implementation you have available for a base model and implement specific features or refinements as needed for your use case.

This way you move quickly, get acceptable performance right away, and can make iterative improvements as long as time allows.

66

ryanglambert t1_ixg68tl wrote

It's all about your goals, just like everything else.
I built this tool in 5 months: https://www.padex.io/articles/introducing-padex-patent-parser. It uses about 5-6 deep learning models, 4 or 5 of them off-the-shelf.

I learned good stuff while working on the 1 or 2 models that were fully bespoke.

I ended up learning quite a bit of pytorch while making those models.

Letting business needs drive what I learn is the most fun for me, and it also seems to have worked.

1

evanthebouncy t1_ixggkxo wrote

Don't do things to prove to yourself you can do things. Have a goal and meet it with minimal effort.

Self-imposed difficulty is, 9 times out of 10, because you haven't got a clear goal.

34

BackgroundChemist t1_ixgmjbl wrote

I think PyTorch is a good level of abstraction for understanding what's going on when building networks, without being knee-deep in the mathematics and fundamentals.

Even then, there are quite a lot of prepackaged networks bundled with PyTorch; from what I remember, you can instantiate AlexNet with one line.
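Something like this, if I'm remembering right (strictly it comes from torchvision, which ships alongside PyTorch; older versions use `pretrained=True` instead of the `weights` argument):

```python
from torchvision.models import alexnet

# One line for a pretrained AlexNet; pass weights=None for random init.
model = alexnet(weights="IMAGENET1K_V1")
```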

2

londons_explorer t1_ixgndbj wrote

When you said 'avoid high level frameworks', I thought you were going to ask about building matrix operations with for loops in C++...

I'm here to say that compilers aren't as good as you imagine, and the built-in intrinsics in PyTorch will perform far better than anything you will write in C/C++/assembly (without months of effort), even before you get to GPU stuff...
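The gap is easy to demonstrate. Hand-rolled loops in plain C/C++ lose to a tuned BLAS by a large factor, and in pure Python the difference is even more dramatic; a deliberately tiny example along these lines:

```python
import time
import torch

n = 64  # kept small on purpose; the naive version is painfully slow
a, b = torch.randn(n, n), torch.randn(n, n)


def naive_matmul(a, b):
    # The triple for-loop a compiler is supposed to magically optimise.
    out = torch.zeros(n, n)
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += a[i, k] * b[k, j]
            out[i, j] = s
    return out


t0 = time.perf_counter()
naive_matmul(a, b)
t1 = time.perf_counter()

_ = torch.matmul(a, b)  # warm-up
t2 = time.perf_counter()
torch.matmul(a, b)
t3 = time.perf_counter()

print(f"naive loops: {t1 - t0:.3f}s  torch.matmul: {t3 - t2:.6f}s")
```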

12

Valdaora t1_ixgphvo wrote

In my projects and in Kaggle competitions, the PyTorch code is always the easiest and shortest part (even though it's obviously the most complicated part theoretically).

There's no reason to switch to another framework like lightning that will disappear in a couple of years.

1

tripple13 t1_ixgqyyi wrote

To be fair, I understand your motivation, I've had similar reservations.

However, the amount of boilerplate code I've been writing (DDP, train/eval loops, tracking metrics, etc.) has shrunk by a huge amount since switching to PyTorch Lightning.

When you are measured by your efficiency in terms of hours spent, I'd definitely argue for simplifying things, rather than not.
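For context, a minimal sketch of what replaces that boilerplate (the model and data here are just placeholders, not anything I'd actually train):

```python
import pytorch_lightning as pl
import torch
from torch import nn


class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.cross_entropy(self.net(x), y)
        self.log("train_loss", loss)  # metric tracking handled by Lightning
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


# DDP, device placement, checkpointing etc. come from Trainer flags
# instead of hand-written loops, e.g.:
# trainer = pl.Trainer(accelerator="gpu", devices=2, strategy="ddp", max_epochs=10)
# trainer.fit(LitClassifier(), train_dataloader)
```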

3

juanigp t1_ixgzh0t wrote

I was doing everything in PyTorch and then switched to Lightning to accomplish my goals more easily, and you still have room for "low level" (with 1000 quote marks) development.

I think that implementing feature X and advancing with my research/work are two different, maybe equally exciting, tasks, and keeping them separate is more productive. If you don't, you end up implementing something close to a Lightning/mmcv/etc. clone, and hey, those already exist!

2

Swimming-Pool397 t1_ixgzv2n wrote

I found PyTorch Lightning to be a bit like using a batteries-included IDE (which I always do, but some would argue against!).

I was able to translate my existing PyTorch code into Lightning very easily, and along the way I also learned about other functionalities and ways of implementing ideas just by poking around Lightning out of curiosity.

1

ThomasBudd93 t1_ixgzy0h wrote

I've built my entire library in PyTorch and it was amazing. I don't regret it at all. If you have a long-term project or want to learn more about coding and DL implementations, I think it's a good choice.

For context: I knew the project I would be working on for the next few years of my PhD. I looked at the current SOTA implementations and didn't like certain aspects, and saw a lot of work ahead if I stuck with them. I read the code a lot and spent a whole day just thinking about my design before writing a single line of code myself. I learnt a lot in that time. If you do that, you get the chance to learn and think about all the little technical details hidden in current frameworks.

After finishing this, I was able to adapt my code quickly to new ideas I wanted to try. Switching from messy IPython notebooks to my library also made my work more reproducible.

This was my experience, but it will differ from case to case. I would say if you have a long-term project, are starting your career, and have the time, definitely do it! You will learn a lot and end up with a codebase you understand from head to toe and can rely on. Otherwise I would reconsider.

Hope that helps!

6

joey234 t1_ixh1qtp wrote

Hmm. I had a pretty bad experience with Lightning some time ago, where I had to rewrite my code a lot whenever they released a new update, so I decided not to use it again. It's probably in a good and stable state now, though. Besides, I think PyTorch itself is high-level enough for me.

2

alterframe t1_ixhmyr3 wrote

That's why it's so difficult to invest in something like Lightning. If you find a good plain-PyTorch repository for your project, you should go with it. You're not going to move everything to Lightning just because you're more comfortable with it.

On the other hand, Lightning actually does a decent job of being modular, so it's mostly fine. TorchMetrics is a great example of how it should be done.
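It's usable completely standalone, for example (exact constructor arguments depend on the torchmetrics version; recent releases take a `task` argument):

```python
import torch
import torchmetrics

# Accumulates statistics across batches, syncs across DDP processes,
# and only reduces to a final value on .compute().
accuracy = torchmetrics.Accuracy(task="multiclass", num_classes=10)

for _ in range(5):                        # stand-in for a validation loop
    preds = torch.randn(32, 10)           # fake logits
    target = torch.randint(0, 10, (32,))  # fake labels
    accuracy.update(preds, target)

print(accuracy.compute())
```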

5

alterframe t1_ixhvkgo wrote

It all boils down to how you behave when something goes wrong. The weights of your layer don't converge? Try some more or less random hyperparameter changes and maybe they finally will. Sometimes that's the only thing you can come up with. Frameworks are just fine for that.

Maybe you have some extra intuition about the problem and want to try something more sophisticated to probe it better? You'd be fine with a framework as long as you deeply understand how it works, because the change you want to make may fall outside its typical usage. Otherwise you'll just get frustrated when something doesn't work as expected.

I get the sentiment against using high-level frameworks. At the beginning they all look like toys for newbies, competing with each other to write the shortest MNIST example. However, as more and more people use them, they get more and more refined. I think at this point Lightning may be worth a try. I myself would have been strongly against it a few years ago, and I was quite annoyed by its rise in popularity, but ultimately it has turned out to be sort of a standard now.

1

leondz t1_ixi3xse wrote

A little bit, yeah. Completely ignoring some of the tools you have at your disposal limits your power and efficiency.

1

sog_unit t1_ixi6ekm wrote

Since it's for work, choose a framework that lets you iterate over ideas quickly without reinventing the wheel. For hobby projects you can mess around and enjoy the learning. You are better off spending your time on the problems that have a business impact.

I am using Lightning for a research project, and one thing I like about it is that if my code runs locally, I am almost certain it will run on the cloud. If you need more control, there is Lightning Lite, which lets you write custom loops while it handles much of the device stuff for you.
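Roughly what that looks like (this sketch uses the Fabric-style API that Lightning Lite evolved into, so the exact import path depends on your Lightning version):

```python
import torch
from lightning.fabric import Fabric
from torch import nn

fabric = Fabric(accelerator="auto", devices="auto")  # device/DDP handled here
fabric.launch()

model = nn.Linear(32, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
model, optimizer = fabric.setup(model, optimizer)

# The loop itself stays fully custom.
for step in range(100):
    x = torch.randn(8, 32, device=fabric.device)      # fake batch
    y = torch.randint(0, 2, (8,), device=fabric.device)
    loss = nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    fabric.backward(loss)  # instead of loss.backward()
    optimizer.step()
```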

Whichever method you go with, have fun and enjoy the learning process.

1

arnowaczynski t1_ixjb5k2 wrote

Stay in PyTorch if you write good quality code and like explicit control over it.

1

Direct_Ad_7772 t1_ixldok1 wrote

I wonder if the usefulness of Lightning depends on your use case. Very standard tasks may benefit from Lightning, while a very exploratory PhD project may benefit from the more explicit control you have in plain PyTorch. Could that be the case?

Basic PyTorch has served me well in my PhD. I can quickly set up a multi-GPU project, e.g. for training GANs or semantic segmentation, and track metrics. The multi-GPU boilerplate is not that much, and learning how Lightning works on the inside (in the end you need to know this anyway) seems like more effort.
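For reference, the multi-GPU boilerplate amounts to roughly this (a sketch of a torchrun-style DDP setup with a placeholder model and fake data; details vary with the launcher and cluster):

```python
import os

import torch
import torch.distributed as dist
from torch import nn
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # torchrun sets RANK, LOCAL_RANK and WORLD_SIZE for us.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = nn.Linear(128, 2).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

    for step in range(100):  # stand-in training loop
        x = torch.randn(16, 128, device=f"cuda:{local_rank}")
        y = torch.randint(0, 2, (16,), device=f"cuda:{local_rank}")
        loss = nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()  # launch with: torchrun --nproc_per_node=NUM_GPUS script.py
```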

1