Hmm, so kinda O(n^1.5) scaling? (Of the ratio between definitely required time and possibly required time, anyway, since a -110% error wouldn’t make sense)
Formerly u/CanadaPlus101 on Reddit.
Taking a wild guess at the source of the confusion, I should be clear that I love Haskell. It’s great for a lot of what I personally end up coding, namely math things that are non-heavy by computer standards but way too heavy to solve by hand. This isn’t naysaying.
I mean, you’re most likely not going to be using an SQL database for either of those applications (I realize I assumed that was obvious when talking about transactions, but perhaps that was a mistake to assume), so it’s not really applicable.
To be clear, I was introducing two new examples where I think this problem would come up. It could be that I’m missing something, but I’ve had this exchange a few times and been unimpressed by the solutions offered. The IO in those cases could get pretty spaghetti-ish. At that point, why not just use a state?
Like, using a list, which is a monad, you could code a Turing machine, and it could have a tape specifying literally anything. I can’t imagine that one would ever come up, though.
Ironically, I actually probably wouldn’t use Haskell for heavy data processing tasks, namely because Python has such an immense ecosystem for it (whether or not it should is another matter).
It certainly is, haha. If it’s heavy Python is just calling Fortran, C or Rust anyway.
I 'member.
So what’s the actual error margin for estimating feature implementation time? It’s going to be nearly the whole thing, right?
I’m not sure what you mean by “locality of reference”. I assume you mean something other than the traditional meaning regarding how processors access memory?
Shit! Sorry, got my wires crossed, I actually meant locality of behavior. Basically, if you’re passing a monad around a bunch without sugar you can’t easily tell what’s in it after a while. Or at least I assume so, I’ve never written anything big in Haskell, just tons of little things.
To give a concrete example:
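(The original example didn’t survive the thread formatting; what follows is a hypothetical reconstruction of the sort of thing meant — the Transaction name comes from the reply below, but the wrapper and every helper here are invented. The point is that once a monadic value is threaded through a few layers, its accumulated contents aren’t visible at the call site.)

```haskell
-- Invented: a state-passing wrapper that accumulates a log of
-- SQL-ish statements. All names here are made up for illustration.
newtype Transaction a = Transaction { run :: [String] -> (a, [String]) }

instance Functor Transaction where
  fmap f (Transaction g) = Transaction $ \s -> let (a, s') = g s in (f a, s')

instance Applicative Transaction where
  pure a = Transaction $ \s -> (a, s)
  Transaction f <*> Transaction g =
    Transaction $ \s -> let (h, s')  = f s
                            (a, s'') = g s'
                        in (h a, s'')

instance Monad Transaction where
  Transaction g >>= f =
    Transaction $ \s -> let (a, s') = g s in run (f a) s'

-- Append one statement to the transaction's log.
emit :: String -> Transaction ()
emit stmt = Transaction $ \s -> ((), s ++ [stmt])

-- After a few layers of helpers like these, what the Transaction
-- actually contains is invisible where it's used:
insertUser, commit :: Transaction ()
insertUser = emit "INSERT INTO users ..."
commit     = emit "COMMIT"

main :: IO ()
main = print (snd (run (insertUser >> commit) []))
-- prints ["INSERT INTO users ...","COMMIT"]
```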
Yeah, that makes tons of sense. It sounds like Transaction is doing what a string might in another language, but just way more elegantly, which fits into the data generation kind of application. I have no idea how you’d code a game or embedded real-time system in a non-ugly way, though.
It also has a type system that is far, far more powerful than what mainstream imperative programming languages are capable of.
Absolutely. Usually the type system is just kind of what the person who wrote the language came up with. The Haskell system by contrast feels maximally precise and clear; it’s probably getting close to the best way to do it.
Yeah, no side-effects seems like it could only improve readability.
It is, although I’m not sure it’s complete. A list is one kind of monad, despite working like immutable linked lists would in any other language. They just happen to behave monadically, providing an obvious and lawful interpretation of the monad functions. Going off of OP you might think monads are all Maybe.
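For instance (this is standard Prelude behaviour, nothing invented): for lists, (>>=) is concatMap, so binding reads as nondeterministic choice — every element fans out into a sub-list and the results are concatenated.

```haskell
-- For lists, (>>=) is concatMap and return makes a singleton,
-- so this enumerates every combination of the two inputs.
pairs :: [(Int, Char)]
pairs = [1, 2] >>= \n -> ['a', 'b'] >>= \c -> return (n, c)

-- The same thing in do-notation:
pairsDo :: [(Int, Char)]
pairsDo = do
  n <- [1, 2]
  c <- ['a', 'b']
  return (n, c)

main :: IO ()
main = print pairs
-- prints [(1,'a'),(1,'b'),(2,'a'),(2,'b')]
```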
I will say that the concept is overhyped at this point, at least in Haskell, and there are a lot of monads available that do what plain functional code could, but worse.
That’s a good rundown of the “why”. The thing is, there are way more things that are monads than things that have to be looked at as monads. AFAIK it only comes up directly when you’re using something like IO or State, where the monad functions are irreversible.
From the compiler end, are there optimisations that make use of the monadic structure of, say, a list?
Whatever Haskell programmers decide to call a monad today. It’s wandered pretty far away from the mathematical definition, despite insistences to the contrary.
(Technically, the requirement is to implement a few functions)
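Concretely, the “few functions” boil down to pure and (>>=), plus the Functor/Applicative boilerplate GHC requires. A minimal sketch using a hand-rolled Maybe-alike (the Opt type and safeDiv are made-up stand-ins):

```haskell
-- A homemade Maybe: Opt, with the minimal instances needed
-- for it to count as a Monad in modern GHC.
data Opt a = None | Some a deriving (Show, Eq)

instance Functor Opt where
  fmap _ None     = None
  fmap f (Some x) = Some (f x)

instance Applicative Opt where
  pure = Some
  None   <*> _ = None
  Some f <*> x = fmap f x

instance Monad Opt where
  None   >>= _ = None
  Some x >>= f = f x

-- Division that signals failure instead of crashing on zero.
safeDiv :: Int -> Int -> Opt Int
safeDiv _ 0 = None
safeDiv x y = Some (x `div` y)

main :: IO ()
main = print (Some 10 >>= safeDiv 100)
-- prints Some 10
```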
It heavily depends on the application, right? Haskell is life for algorithmically generating or analysing data, but I’m not really convinced by the ways available in it to do interaction with users or outside systems. It pretty much feels like you’re doing imperative code again just in the form of monads, after a while. Which is actually worse from a locality of behavior perspective.
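To illustrate the flavour (a generic sketch I’m making up, not from anyone’s actual code): even simple stateful interaction in IO ends up sequenced step by step, much like an imperative program with a mutable variable and a loop.

```haskell
import Data.IORef  -- mutable references, part of base

main :: IO ()
main = do
  -- declare a mutable variable, loop over inputs, read it back:
  -- structurally this is an imperative program, just inside IO
  counter <- newIORef (0 :: Int)
  mapM_ (\line -> modifyIORef counter (+ length line)) ["ab", "cde"]
  total <- readIORef counter
  print total
-- prints 5
```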
I also wonder about acoustic signatures on these supercruise 6th gen models. Sure, you spend less cumulative time behind enemy lines, but microphones are cheap and sonic booms are nicely all-at-once and loud. Especially if you’re flying nap-of-the-earth for the additional radar and IR/optical cover.
NG decided not to bid, actually, so it was a binary choice.
Y’know, from what I’ve seen of the F-35 it’s an amazingly sane machine considering the requirements were very much not sane.
Political convenience is a big one with them. At least with the passenger airplanes, their only competitor is very non-American Airbus, so they’re not only too big to fail but too embarrassing to fail.
I wouldn’t be surprised at all if in 10 years their liners regularly fall out of the sky, and the US government just covers it up domestically the way China does with a lot of their corner cutting.
That being said, I can’t rule out that their bid was simply better here, at least on paper.
An actual compsci professor would know real CPUs don’t run arbitrary recursion, right? Nobody could possibly be that siloed.
Yes, the terror of nuclear annihilation will be significantly tempered by the hilarity if he tries taking on Europe.
Okay, that sounds sarcastic, but I think I actually mean it. He will get rekt and it will be satisfying. Then I’ll turn to figuring out nuclear winter in fucking Canada.
Eh, the rest of the Western world is slowly getting there.
We’ve all noticed this, right? Some of it is just age, but my dad ballooned just after he married and I doubt it was a coincidence.
You know, I actually wonder if there’s a universal law at play here. Maybe the same budget and the same ability to project accountability always gets the same return. To fund every good idea in a short period you have to fund a certain amount of bad ones.
Well, that’s maddening.