What is duck typing?
In most programming languages data has a type. You can think of it as the ‘shape’ of the data. For example: you have an ‘integer’ type which can contain whole numbers, a ‘floating point’ type which can contain fractional numbers, a ‘string’ type which can contain text, etc.
These simple types are called 'primitive' types; they are built into the language itself.
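In Python, for instance (the values here are just made up):

```python
age = 42            # an 'integer' (int): whole numbers
temperature = 21.5  # a 'floating point' number (float): fractional numbers
name = "Lemmy"      # a 'string' (str): text
```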
Usually, you can also define your own types, which are built up out of the primitive types. For example, you can have a 'Car' type:
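(A minimal sketch in Python, using the same field names this comment refers to later; real Python code would normally spell them snake_case.)

```python
from dataclasses import dataclass

@dataclass
class Car:
    maximumSpeed: int        # e.g. in km/h
    numberOfPassengers: int
    brand: str
    model: str
```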
Meaning any piece of data of type 'Car' has a maximum speed, a number of passengers, a brand, and a model.
Now, languages like Java have so-called static typing. You declare something to be a Car, and the compiler knows that a Car has these properties. This is used in a lot of places: variables have types, parameters passed to a function have types, the return values of functions have types, and so on. Static typing means you always know exactly what type of data you are dealing with.
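For comparison, Python can express the same kind of declarations as optional type hints, reusing the Car sketch above. Note these are only hints: plain Python ignores them at runtime, and only external checkers like mypy actually enforce them, whereas a Java compiler enforces its types for you.

```python
def describe(car: Car) -> str:
    # The parameter and return types are declared, much like in Java,
    # but plain Python won't stop you from passing something else in.
    return f"{car.brand} {car.model}, seats {car.numberOfPassengers}"
```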
Duck typing is the exact opposite. The term comes from 'if it walks like a duck and quacks like a duck, it's probably a duck'. In a duck-typed language you don't declare what type something is; you just use the data as if you know what type it is, and if it really is the correct type, it works. If it's not, things might break.
In a statically typed language it is impossible to use the wrong type; this is enforced by the compiler. In a duck-typed language there is no such check, because types are never declared. So if a function expects a 'Car' as a parameter and you pass in a 'Horse' instead, in Java you would get an error when trying to compile the code. Python would just run it. This may be fine: a Horse may also have a maximumSpeed, and if that's all you read, it works. But when you try to access something a Horse doesn't have, like the 'brand' property, things go tits up.
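Roughly what that looks like in Python, reusing the Car sketch above (the Horse fields and values are just made up for illustration):

```python
from dataclasses import dataclass

@dataclass
class Horse:
    maximumSpeed: int
    name: str

def print_specs(car):           # no type declared; the code just assumes it gets a Car
    print(car.maximumSpeed)     # a Horse happens to have this too, so it works
    print(car.brand)            # AttributeError: 'Horse' object has no attribute 'brand'

print_specs(Horse(maximumSpeed=55, name="Jolly Jumper"))
```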
The main problem with this is that you only see the error if you happen to run that specific bit of code in that specific situation, and that may never happen during testing. Worse, if you change anything in 'Car' you don't necessarily catch all the problems the change causes. Say you rename 'numberOfPassengers' to 'passengerCount': in Java, any code that still tried to access 'numberOfPassengers' would fail to compile, so you'd immediately get an error. In Python you wouldn't spot the problem at all until it's too late.
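A sketch of that rename in Python (again, the names and fields are just illustrative):

```python
from dataclasses import dataclass

@dataclass
class Car:
    maximumSpeed: int
    passengerCount: int   # renamed from numberOfPassengers
    brand: str
    model: str

def seats_left(car, occupied):
    # This line was never updated after the rename. It still loads fine;
    # it only fails when the function is actually called.
    return car.numberOfPassengers - occupied   # AttributeError at runtime
```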
The advantage of duck typing is that it's less verbose: you can whip something together quickly without having to think too much about the types. For a small, simple program this is perfect, but the larger your codebase gets, the harder it becomes to manage. You can't oversee the whole codebase anymore, mistakes happen, and there's no compiler to catch them.
You can mitigate it a little, for example by writing lots of automated tests, but you really shouldn’t have to. A static type system prevents a lot of dumb mistakes from being made.
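For example, a test that simply exercises the seats_left sketch above would trip over the rename, but only if someone remembered to write it and run it (the brand and model here are made up):

```python
def test_seats_left():
    car = Car(maximumSpeed=200, passengerCount=5, brand="Volvo", model="V60")
    # Calling the code at all is enough to hit the AttributeError from the
    # rename above, but only because this test happens to exist.
    assert seats_left(car, occupied=2) == 3
```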