Hey there!
I'm a chemical physicist who has been using Python (as well as MATLAB and R) for a lot of different tasks over the last ~10 years, mostly for data analysis but also to automate certain tasks. I am almost completely self-taught, and though I have gotten help and tips from professors throughout my degrees, I have never really been educated in best practices when it comes to coding.
I have some friends who work as developers but have a similar academic background to mine, and through them I have become painfully aware of how bad my code is. When I write code, it simply needs to do the thing, conventions be damned. I do try to read up on the "right" way to do things, but the holes in my knowledge become apparent pretty quickly.
For example, I have never written a class and I wouldn't know why or where to start (something to do with the init method, right?). I mostly just write functions and scripts that perform the tasks I need, plus some work with Jupyter notebooks from time to time. I only recently got started with Git and uploading my projects to GitHub, just as a way to teach myself the workflow.
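From the little I've read, I gather a class just bundles data together with the functions that operate on it, something like the sketch below (the `Spectrum` example is made up for illustration, and I have no idea if this is idiomatic):

```python
class Spectrum:
    """A measured spectrum plus the operations that act on it."""

    def __init__(self, wavelengths, intensities):
        # __init__ runs when an instance is created; it stores the
        # data on `self` so every method can use it later.
        self.wavelengths = wavelengths
        self.intensities = intensities

    def normalized(self):
        # Intensities rescaled so the peak value is 1.0.
        peak = max(self.intensities)
        return [i / peak for i in self.intensities]

spec = Spectrum([400, 500, 600], [2.0, 8.0, 4.0])
print(spec.normalized())  # [0.25, 1.0, 0.5]
```

Is that roughly the idea, and when is that actually better than passing the two lists to plain functions?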
So, I would like to learn to be better. Can anyone recommend good resources for learning programming, ideally ones aimed at people who already know a language? It'd be nice to find a guide that assumes you know more than a beginner. Any help would be appreciated.
Odd take imo. OP is a programmer, albeit perhaps not a very good one. I did a PhD (computational astrophysics) and have been working as a professional dev for the 10 years since. Imo a good programmer writes code that solves the problem at hand; I don't see much of a difference whether the problem is scientific or a backend service. It doesn't mean "write lots of boilerplate-y factories, interfaces and other layers" to me, neither in research nor outside of it.
That being said, so much time is lost in research institutes because of shoddy programming by researchers, or simple ignorance (not knowing a debugger exists, for instance). OP wanting to level up their game would almost certainly get them to research results faster, and they may be able to help their peers become better as well.
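For anyone reading this who is in that boat: Python ships with a debugger. A toy sketch, with a made-up `running_mean` function just to have somewhere to pause:

```python
def running_mean(values, window):
    out = []
    for i in range(len(values) - window + 1):
        chunk = values[i:i + window]
        # Pauses execution here and drops into the pdb debugger:
        # inspect any variable by name, `n` steps to the next line,
        # `c` continues to the next breakpoint.
        breakpoint()
        out.append(sum(chunk) / window)
    return out

running_mean([1.0, 2.0, 3.0, 4.0], 2)
```

Run the script normally and it stops at each `breakpoint()` call, which beats sprinkling print statements everywhere.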
25 years in the industry here. As I said, there's nothing against learning something new, but I doubt it's as easy as "leveling up".
Both fields profit a lot from experience, and it's as much of a gain for a scientist to become a software dev as for an architect to become a carpenter. It's simply not productive.
Well, that's the way it is. Scientific code and production code have different requirements. To me that sounds like "that machine prototype is inefficient - just skip the prototype next time and build the real thing right away."
I don't think you understand my point, which is that developing the prototype takes, say, 50% more time than it should because of a complete lack of understanding of software development.