thedarklord176 OP t1_j7j35wu wrote
Reply to comment by The-Last-Lion-Turtle in Wouldn’t it be a good idea to bring a more energy efficient language into the ML world to reduce the insane costs a bit?[D] by thedarklord176
But isn’t everything in Python built on C? By that logic I’d think switching languages would make no difference, because it’s still Python underneath. Not saying you’re wrong; I don’t work in AI, I’m just curious.
username-requirement t1_j7j5ihu wrote
The critical factor to consider is whether the computation spends its time in the Python code or in C/C++.
Many of Python's language constructs are quite slow, and this is why libraries like numpy exist. The program spends relatively little time in the Python code, which merely acts as an interpreted, rapid-to-modify "glue" between the compiled C/C++ library functions.
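A rough sketch of what that means in practice (the exact timings depend on your machine, but the vectorized call is typically one to two orders of magnitude faster, because the loop runs in compiled C instead of the interpreter):

```python
import time
import numpy as np

x = np.random.rand(1_000_000)

# Pure-Python loop: every iteration goes through the interpreter.
start = time.perf_counter()
total = 0.0
for value in x:
    total += value
print("python loop:", time.perf_counter() - start)

# NumPy sum: one Python call, the actual loop runs in compiled C.
start = time.perf_counter()
total = x.sum()
print("numpy sum:  ", time.perf_counter() - start)
```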
In the case of TensorFlow and PyTorch, virtually all the computation is done in C/C++, and Python is basically acting as a highly flexible configuration language to do the setup.
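A minimal sketch of what that "configuration language" role looks like with PyTorch (the Python lines only describe the computation; the heavy lifting happens in the compiled backend):

```python
import torch

# Each call below dispatches into PyTorch's compiled C++ kernels;
# the Python code just wires the operations together.
model = torch.nn.Linear(512, 256)
x = torch.randn(64, 512)

y = model(x)            # matrix multiply + bias, executed in C++
loss = y.pow(2).mean()
loss.backward()         # autograd also runs in the C++ backend
```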
currentscurrents t1_j7j78vc wrote
All the computation is happening on the GPU. Python is just making a bunch of calls to the GPU drivers.
Researchers spend a lot of time making neural networks as fast as possible. If switching to another language would have given a substantial speed boost, they would have done it already.
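A small sketch of that division of labor, assuming a CUDA-capable machine: the Python line that "does" the matrix multiply just enqueues a kernel launch, and the GPU does the actual work asynchronously.

```python
import torch

if torch.cuda.is_available():
    a = torch.randn(4096, 4096, device="cuda")
    b = torch.randn(4096, 4096, device="cuda")

    # This Python line only enqueues a kernel launch; the multiply
    # itself runs asynchronously on the GPU.
    c = a @ b

    # Waiting for the GPU to finish shows where the real time goes,
    # which is why naive Python-side timing of GPU code is misleading.
    torch.cuda.synchronize()
```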
blacksnowboader t1_j7j6y2y wrote
The answer is sort of, but not really. A lot of the common packages in ML and data science are written in Python, but the computations happen in C/C++, Fortran, Scala, and other compiled languages, to name a few.
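One way to see that in an installed environment (the output depends on how your NumPy was built):

```python
import numpy as np

# Prints the compiled BLAS/LAPACK backends (OpenBLAS, MKL, ...) that
# this NumPy build is linked against; that's where the heavy linear
# algebra actually runs, much of it tracing back to Fortran LAPACK.
np.show_config()
```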