Interests: News, Finance, Computer, Science, Tech, and Living

  • 0 Posts
  • 45 Comments
Joined 1 year ago
Cake day: June 13th, 2023


  • People use Python a lot as a Matlab, Excel/VBA, or R alternative; that was my use for many years. Some of these are compute-focused problems, and if the dataset is large enough and the computations complex enough, speed can be an issue.

    As for loading packages and printing: who cares. Those are not computationally intensive and are typically IO bound.
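
    To see the distinction, compare where the time actually goes (a rough sketch; the sizes and the loop body are arbitrary):

        import time

        # Compute-bound: a pure-Python arithmetic loop, limited by the
        # interpreter itself.
        start = time.perf_counter()
        total = 0.0
        for i in range(10_000_000):
            total += i * 0.5
        print(f"compute loop: {time.perf_counter() - start:.2f} s")

        # IO-bound-ish: importing a package is dominated by disk reads,
        # not by the interpreter's arithmetic speed.
        start = time.perf_counter()
        import json
        print(f"import json: {time.perf_counter() - start:.4f} s")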



  • Same for me. I have used Python for most things since the late 1990s. Love Python. Have always hated the poor performance… but in my case it was mostly good enough. When it was not good enough, I wrote C code.

    Python is good for problems where time to code is the limiting factor. It sucks for compute-bound problems where time to execute is the limiting factor. Most problems in my world are time-to-code limited, but some are not.

    Python compute performance has always sucked.



  • The reason you do not need a TypeScript for Python is that Python is a real language. JavaScript was a crap extension language that people have been trying to work around forever with preprocessors…

    As far as needing types… One of the big advantages of Python is not needing types. I have used Python for 25 years and never used types or missed them.
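
    For context on why you can just skip them: annotations in Python are purely optional and ignored at runtime, so nothing forces you to use them (a minimal sketch):

        # Both versions run identically; CPython does not enforce the hints
        # at runtime. They only matter to external checkers like mypy.
        def area(width, height):
            return width * height

        def area_typed(width: float, height: float) -> float:
            return width * height

        print(area(3, 4), area_typed(3, 4))  # 12 12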

    What I do occasionally miss is speed. That is a combination of the lack of static typing and crap implementations, and there are various ways around it.



  • You get an IT staff that is MS and Windows certified; what sort of answer do you expect them to give? As for the IT staff where I worked, they often had issues resolving Windows problems, to say nothing of Linux. Generally for Windows, I had to get to level 3 support before they knew anything, and even then I often had to tell them what needed to be done rather than them actually knowing. Some of this is lack of skill, some of it is understaffing, some of it is restrictive processes, and some is organizational issues. You had to know how to work the system on one hand, and which issues not to waste time on, on the other. Not saying they did not try hard, but without facilitation their results were often insufficient.

    That does not mean you cannot use Linux, however. It just means the main IT group does not support it. We had a separate group that ran the Linux compute cluster we used, and I also typically had a Linux VM on my workstation to use FOSS tools. Not sure that would be allowed these days, since IT has gotten nuts about security, and with that they have generally grabbed a lot of power regarding what can and cannot be done on “their” hardware and “their” networks. You can also get exceptions to a lot of those rules if you can justify it and your management is willing to run it up the flagpole. If not, you are working for the wrong people.


  • Pick and choose. I actually like most of the Python docs. I originally learned Python from their tutorial, then learned key parts of the library, so I like those two documents. The other docs, though, can be deep; the language reference, for example, I have never read except in parts.

    I also had a book about Tkinter and another about Win32 Python programming, so I learned from those too. My first app was a data acquisition tool with a Tkinter GUI. So I think a few books are good, but maybe people do not do that now.
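
    For anyone curious, the Tkinter skeleton of an app like that starts very small (a sketch, not my original code; the acquire callback is a placeholder for real instrument reads):

        import tkinter as tk

        # Minimal Tkinter skeleton: a window, a live label, and a button.
        root = tk.Tk()
        root.title("Data acquisition (sketch)")

        reading = tk.StringVar(value="no data yet")
        tk.Label(root, textvariable=reading).pack(padx=20, pady=10)

        def acquire():
            # A real app would read an instrument here.
            reading.set("sample: 42.0")

        tk.Button(root, text="Acquire", command=acquire).pack(pady=10)
        root.mainloop()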

    For me, I learned Python in a day, mostly from their tutorial and the standard library reference; it then took me the next 9 months to actually get good at it, and I am still learning stuff 25 years later. I did have an advantage: I had been programming for 20 years before I learned Python and had used half a dozen other languages.





  • Yes, there are a lot of assumptions, incorrect information, or at least misleading stuff out there, so I am always interested in learning more about easy and hard ways to make things better. For most things I do, Python is fast enough, but sometimes it is not.

    The things I find misleading about what people often say about Python are that it is not that slow, and that you can always just use a library like numpy or something similar to solve speed issues. I found both to be more or less untrue in the sense of getting C-like speeds. On my code, Python was indeed slow, like 1% of C speed.

    The surprising thing for me was that using numpy helps a lot, but not as much as you would think; I only got to 5 to 10% of C speed with numpy. This is because libraries are often generically compiled, and to get good speed you really need C code compiled for your specific hardware with, at least, vectorization, auto-parallelization, and fast math. Generic libraries are just not going to be that fast.

    Another one people push is using GPUs. That also is not very effective unless you have a very expensive card, most notably one designed just for compute, or an array of them. The GPU throughput of my workstation is significantly less than that of my CPU.

    There are interesting hardware limitations too. My AMD Ryzen 7 based workstation would have twice the speed with quad-channel rather than the far more common dual-channel memory, since fully optimized code is memory-IO bound at about half the CPU throughput. This must be why AMD Ryzen Threadrippers seem to use quad-channel memory.
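
    To make that concrete, the comparison I mean has roughly this shape (a sketch; the kernel and array size are placeholders, and the ratios depend entirely on your hardware and on how the libraries were built):

        import time
        import numpy as np

        n = 10_000_000
        a = np.random.rand(n)

        # Pure-Python loop: every iteration goes through the interpreter.
        start = time.perf_counter()
        total = 0.0
        for x in a:
            total += x * x
        print(f"pure Python: {time.perf_counter() - start:.2f} s")

        # numpy pushes the loop into a precompiled C kernel, but that
        # kernel is generically compiled, so it still may not match
        # C that is tuned for your specific hardware.
        start = time.perf_counter()
        total = float(np.dot(a, a))
        print(f"numpy:       {time.perf_counter() - start:.4f} s")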

    There are ways around a lot of this. For example, using numba can be incredible. Similarly, writing your own C code and carefully compiling it; the careful compile is critical. Maybe one could do the same with some stock libraries by carefully compiling them. A lot of the other stuff people talk about, such as pypy, cython, or nuitka, just does not work very well in terms of speed or effort.
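
    As an illustration of the numba route (a sketch; it assumes numba is installed, and sum_squares is just a stand-in kernel):

        import numpy as np
        from numba import njit, prange

        # nopython mode compiles the loop to machine code for this CPU;
        # parallel=True plus prange spreads it across cores, and
        # fastmath=True relaxes IEEE rules much like -ffast-math in C.
        @njit(parallel=True, fastmath=True)
        def sum_squares(a):
            total = 0.0
            for i in prange(a.size):
                total += a[i] * a[i]
            return total

        a = np.random.rand(10_000_000)
        print(sum_squares(a))  # first call includes JIT compile time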



  • For what it is worth, my take on the article: a really overwhelming list. A nice read-through, but for those who are interested, the most useful components discussed were probably:

    • CPython. Of course, that is what we all use.
    • PyPy. An interesting accelerator if you do not need numpy and a lot of other common libraries. The acceleration is maybe 9X in my experience; however, C code or good use of numba can often get 100X.
    • MicroPython. Not tried, but seems cool if you need a really small Python. Presumably not exactly compatible, because of missing libraries.
    • Pyston. Have not tried it, but it seemed interesting from their discussion of the “pyston_lite_autoload” thing. I have no idea if it is useful.
    • Cython. A lot of hoopla about this. Good software, but my experience is that you do not get much speedup until you statically declare things. When I did that I got about 24X, and playing with prange and the OpenMP features got me to 75X. Not a bad speedup, but it does not look so good compared with writing C code or using numba, mainly because those speedups seem easier to get; I got as much as 121X using them instead. Cython is just complex to use and then does not get you your full entitlement with respect to speed, or at least that was my experience.
    • Numba. Numba and numpy used in the correct situations gave me 121X speed improvements and performance similar to parallelized and vectorized C code; for some reason it was actually faster than my C code. This combo is super. Everyone should know about Numba.
    • Nuitka. A very handy deployment tool, but basically the same speed as CPython in my experience; I got about a 9% improvement, which is almost nothing. So do not be fooled into thinking it will give you big speed improvements. It is a very nice tool as part of your packaging and deployment process, though.

    Since I talked about C code: there are three ways to integrate C code into Python: ctypes, CFFI, and the standard C extension method. I found ctypes to be about 107X, CFFI about 108X, and the standard method about 112X for my code on my hardware, with the C code compiled with auto-parallelization, auto-vectorization, fast math, and maybe other settings. My point: the speeds of these are about the same, though the standard method is a little faster, so you can pretty much do whichever is easier.
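
    For the ctypes route, the pattern looks roughly like this (a sketch: libsum.so, sum_squares, and the exact compiler flags are illustrative, not my actual code):

        # C side (sum.c), compiled hardware-specifically, e.g.:
        #   gcc -O3 -march=native -ffast-math -fopenmp -shared -fPIC sum.c -o libsum.so
        #
        #   double sum_squares(const double *a, long n) {
        #       double total = 0.0;
        #       #pragma omp parallel for reduction(+:total)
        #       for (long i = 0; i < n; i++) total += a[i] * a[i];
        #       return total;
        #   }

        import ctypes
        import numpy as np

        lib = ctypes.CDLL("./libsum.so")
        lib.sum_squares.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_long]
        lib.sum_squares.restype = ctypes.c_double

        a = np.random.rand(10_000_000)
        ptr = a.ctypes.data_as(ctypes.POINTER(ctypes.c_double))
        print(lib.sum_squares(ptr, a.size))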

    Anyway my thoughts. Hope they make some sense.



  • Wow! Why are you so threatened by the 4 freedoms? No one is telling you what to do with your own code. It is rather astounding to condemn the FSF for having an agenda when you clearly have one of your own.

    As far as companies and other actors buying power, that is everywhere. As for IP laws, that depends more on whether a country wants to exploit its IP or freeload on other people’s work. I cannot say I feel strongly either way.


  • I will say something that may appear counter to what I said before. One advantage of permissive licenses is that they are more flexible and compatible over time. Also, certain licenses become the norm in certain communities. It also depends on who you want using your code or library: most corporations will not use code any more restrictive than the LGPL. So though the GPL and LGPL are interesting where they work, they do not always make sense. It is also why the GPL and LGPL should be applied as “version 3 or later”, not just version 3, so the license can be upgraded to remain compatible. The other option is to assign your copyright to the FSF so they can make any needed changes.
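
    Concretely, the difference is one phrase in the per-file notice. The standard GNU wording for the “or later” form is:

        This program is free software: you can redistribute it and/or modify
        it under the terms of the GNU General Public License as published by
        the Free Software Foundation, either version 3 of the License, or
        (at your option) any later version.

    Drop the “or (at your option) any later version” clause and the code is pinned to version 3 only, with no path to future license versions.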


  • GPL compliance is not voluntary; its legal validity has been proven many times. Questions about specific jurisdictions would require a lawyer in that jurisdiction to give an opinion, though the FSF did look at the general treatment of copyrighted works around the world when writing GPLv3. Keep in mind, too, that what people do versus what is technically legal varies widely. More than that, global corporations, or corporations that are part of a global supply chain, will probably have to comply with the rules of the most restrictive jurisdiction. The copyright owners can choose to sue at any time if they find out about a violation, and that is of course more likely when a pattern of abuse is found.