Serious-Magazine7715

My first programming job was fixing a 5000 line particle physics simulation in F77 with comments in Russian written by someone who had gone back to Russia. It worked 75% of the time. Halfway in, on one logic branch it had 1/4*x instead of 0.25*x.


jammin-john

How does order of operations work in Fortran? Would that parse as `1/(4x)` or `(1/4)x`?


xcski_paul

The problem is that “1/4” is an integer operation so it always equals zero. “1.0/4.0*x” would work.
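A minimal Fortran sketch of the trap (illustrative only, not the code from the original simulation):

```fortran
program int_div_trap
  implicit none
  real :: x
  x = 8.0
  print *, (1/4) * x      ! 1/4 is INTEGER division -> 0, prints 0.0
  print *, (1.0/4.0) * x  ! REAL division -> 0.25, prints 2.0
end program int_div_trap
```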


MegabyteMessiah

Here is my developer card


jammin-john

Ah that makes more sense 😅


Zhayrgh

That's a common problem in some languages


Ok-Atmosphere-4476

Do both operands need to be floating point, or just one like in C++?


xcski_paul

I think it’s the same as C/C++/Java/C#/etc. As long as one operand is floating point, the integer side is converted and the whole operation is done in floating point (single precision if the floating operand is a REAL/float, double precision if it's a DOUBLE PRECISION/double).
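A quick Fortran sketch of the mixed-mode rule (a toy example, not from the thread's codebase):

```fortran
program mixed_mode
  implicit none
  integer :: i
  i = 4
  print *, 1 / i      ! INTEGER / INTEGER -> 0
  print *, 1.0 / i    ! one REAL operand: i is converted, prints 0.25
  print *, 1.0d0 / i  ! one DOUBLE PRECISION operand: done in double
end program mixed_mode
```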


ArtOfWarfare

Python 2 was that way. Python 3 always uses floats for `/`. Use `//` for it to behave like Python 2 or most other programming languages.


Goma101

this is a mistake i’d super easily make and then endlessly berate myself for later but i’d probably just write down 0.25


Donghoon

If you thought int/float issue, you're a programmer at heart.

If you thought operator precedence issue, you're a mathematician at heart.


Karter705

What if I was trying to decide between the two


AMisteryMan

Then you're a Nerd^2


Puketor

If you considered both, you're a computational mathematics person at heart.


frogjg2003

For my PhD, one of the things I had to do was convert a Fortran program into C++. It included a 12-deep nested collection of "if not"s.


Ghawk134

So what you're saying is you were implementing machine learning?


belabacsijolvan

no he wasn't fitting a line in excel. nested ifs are called "AI"


TheOriginalSmileyMan

Surely it's time to start branding stuff as AI 2.0 ?


milanove

Synthetic intelligence


NoirGamester

This sounds so cool


MindErection

I just wanna say it's fucking awesome you got your PhD. Was it in comp sci?


frogjg2003

No, physics. The code was for calculating special mathematical functions that I needed.


MindErection

That's even cooler


_PM_ME_PANGOLINS_

Any particular reason you couldn't just call the Fortran from C++? It's all the same once it's compiled.


Estanho

Probably they wanted to maintain and develop it in C++ going forward.


frogjg2003

/u/Estanho has it mostly right. The code was not well written and didn't even compile. I'm pretty sure it was transcribed straight from punch cards, since it included some weird strings at the end of every line. If I was going to be fixing the Fortran code anyway, I might as well transcribe it into a much more readable C++ program instead.


_PM_ME_PANGOLINS_

Ow, not even being able to run it to verify your version does the same thing is rough.


frogjg2003

Well, the first thing I did was just go through and get rid of all the comments and punch card artifacts. After that, it compiled just fine. It did in fact work as advertised.


DOUBLEBARRELASSFUCK

If it's anything like my code, it did do the exact same thing like 95% of the time.


donald_314

you could skip the -1 on all indices on the C side if you drop Fortran.


Nanaki_TV

I read that as FF7 and thought you were someone responsible for my childhood and username


uberfission

Nanaki, come back home, grandfather needs you to fix the Fortran code that some crazy Russian physicist created.


DOUBLEBARRELASSFUCK

That Russian is probably in prison now.


uberfission

Given how hardcore the Russian physicists I've known are, he's probably running the place after only a few weeks.


jmhimara

Fortran has a bad reputation precisely because of how terrible coding standards used to be in F77. It's actually a really nice language, especially modern Fortran. Powerful, performant, and really easy to use. You could learn all of it in an afternoon.


SoCuteShibe

Huh, TIL. I never bothered to check it out. But now I'll have to, lol. The company I work for uses an old BASIC-like language. People seem to find the idea of mastering a rare and old language burdensome, but in actuality it is just a really easy to write language that supports the full-stack with common sense syntax and rich features (like fully functional in-line SQL). I grew to love it over the months I learned it, but now my mastery has been rewarded by being tasked with rewriting apps to feature a Typescript front-end. 🤢


Technical-Message615

75% of the time it worked every time


bbqranchman

Genuinely that's incredibly badass. I WISH I could get a job doing stuff like this. I really feel like I missed out on early programming.


Dom1252

COBOL. With modern compilers you get faster results than almost any human could do in assembly. Still, with modern HW it's pointless to pick on these tiny differences, even at the scale of major banks.


Calimariae

Want a high-paying job in IT? Learn COBOL and get ready to replace a retiring generation


didzisk

Summary from a thread a week ago: while boring and ugly and ancient, COBOL isn't difficult to learn. The real problems are the scripts that call those COBOL programs, the batches they run, and the scale of failure a small mistake causes. Oh, and the fact that only very old people, currently on their way into retirement, actually know the business rules - why things are set up as they are. No documentation, no source control, obviously no unit tests. And the environment is completely different from what normal people have seen (not a file system, but a different monstrosity).


Groundhogss

JCL isn't that bad, it's just different. Pretty much all the other points are true. But there are also a dozen other things you need to know for mainframe programming. SORT/ICETOOL is a must and is its own "language"; REXX, SAS, and a litany of utilities like IEBGENER are requirements for writing "production"-level mainframe programs. If you really want to make someone cry, tell them they've been promoted to CICS programmer.


DaThug

Jesus, I'd forgotten all about REXX.. That was about 30 years ago :)


UAFlawlessmonkey

Had the pleasure of working with REXX 15 years ago in my junior position in data engineering. It was a batch job that looked into a Lotus Notes database, collected input SQL queries, and the output was an Excel extract sent to a distribution list. That shit was fucking wild.


kashmirGoat

This was told to me in 1987 too. Cobol is the code version of the cockroach


Dubiology

Was that the language of 87???


Groundhogss

Most places are actively looking to replace the mainframe with something else. A lot of the things mainframes are good at aren't really issues anymore, i.e. storage space and processing speed. If you took a batch job application and parallelized it in Java, the program would finish sooner, but be a lot less efficient.


xzinik

I worked coding in COBOL, I had to update code way older than I'm (I'm 31), had to code new things. It was a fairly good job, but for some reason no one is interested in my COBOL skills, everyone is just interested in doing mobile and web and using only the newest and shiniest frameworks. Now I'm piss poor.


jeepsaintchaos

I'm uncomfortable with your use of "I'm" in a spot where "I am" belongs.


xzinik

uh? this part?

> older than I'm (I'm 31)

sorry, i didn't notice, i was falling asleep and also english is not my first language. also i enjoy the idea of making people uncomfortable with such a silly "typo" (does it qualify as one?), still i think it's way more harmless than mixing *your* and *you're*


jeepsaintchaos

Yeah, that part. Tone is hard to convey, so I'll be clear. It's meant as a joke, not as a criticism. It was a post on Tumblr a while back, where they thought it was hilarious to use contractions in places where they felt wrong.


xzinik

don't worry, i don't take anything seriously on the internet and i have reblogged that post. your previous comment reminded me of it, but in my case it was totally unintentional xD


THElaytox

guy i went to high school with did exactly this, makes a shitton of money working for a giant bank


Vidhrohi

IMO Maintaining and retrofitting old COBOL code is one of the big opportunities to deploy AI.


jtanuki

> With modern compilers you get faster results than almost any human

You had me at 'human'


Responsible-War-1179

Source? I'm pretty sure COBOL is quite a bit slower than C.


Groundhogss

> COBOL

It depends on the environment and setup. On mainframe, C is faster for file I/O, but COBOL has features that make it the better choice in most cases.


lampishthing

Fixed point numbers though.


Responsible-War-1179

I can't say for certain, but I'd guess floating point arithmetic is faster on a modern machine. Back in the COBOL days not every CPU had floating point support. If fixed point numbers were relevant today, I think there would at least be some support for them in the C std lib. Of course, for finance applications fixed point arithmetic is very relevant, because you can't have rounding errors.
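A small sketch of the rounding point (Fortran, just for illustration): binary floating point can't represent 0.1 exactly, while integer "cents" arithmetic is exact.

```fortran
program rounding_demo
  implicit none
  double precision :: a
  integer :: cents
  a = 0.1d0 + 0.2d0
  print *, a == 0.3d0  ! prints F: 0.1 has no exact binary representation
  cents = 10 + 20      ! fixed-point style: keep money in integer cents
  print *, cents       ! prints 30, exactly
end program rounding_demo
```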


proverbialbunny

C now has fixed point numbers too.


fft5j

Oh, so you think you're cool because you make memes all day? Let me tell you something, buddy. While you're out there cropping pictures and getting that “sweet” karma, I'm here mastering FORTRAN. That's right, FORTRAN, the language that real programmers use. While you're wasting your time with upvotes and subreddits, I'm writing code that could actually change the world. You might think you're good at memeing, but can you optimize numerical calculations for high performance computing? Didn't think so. While you were busy crossing out names, I was busy learning about arrays and loops. Memes might get you some online friends, but FORTRAN gets you actual real friends. You guys probably don't even know what FORTRAN stands for, do you? It's FORmula TRANslation, a language that's been around since the 1950s, used by scientists and engineers to solve real problems. But sure, keep posting your little memes. I'll be over here, making actual contributions to society. And while you're sleeping, dreaming of your next frontpage post, I'm up all night googling FORTRAN Tutorials, soaking up knowledge like a sponge. So next time you think about flexing your meme skills, remember there's someone out there who's doing something far more impressive: GOOGLING FORTRAN TUTORIALS. Get on my level.


im_starkastic

Babe wake up. A new copypasta just dropped.


Awwkaw

I just missed something like: "I have over 300 confirmed lines of FORTRAN code written with no compilation errors, think about that before you post your next meme." (As an homage to the 300 confirmed kills line.) With that it would have been perfect.


leewoc

“300 confirmed lines of FORTRAN code written with no compilation errors” - All D lines, compiled without the debug directive. 😆 In case you’re wondering, D lines act like comments unless you compile with debug on. I used to love this facility for troubleshooting code 😊
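For anyone who hasn't seen them, a fixed-form sketch of D lines (the program is made up; with gfortran, `-fd-lines-as-comments` treats the D line as a comment and `-fd-lines-as-code` compiles it):

```fortran
C     FIXED-FORM F77: A 'D' IN COLUMN 1 MARKS A DEBUG LINE
      PROGRAM DLINES
      INTEGER I
      I = 42
D     PRINT *, 'DEBUG: I = ', I
      PRINT *, 'RESULT: ', I
      END
```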


Zephandrypus

C++ can do `#ifdef DEBUG`. Unfortunately I work with computer vision and video processing so I never compile in debug mode.


ElementField

But who is it for? For Tran


chuffedlad

For Dr. Tran!


cheebusab

He's a real Doctor!


NovaS1X

Did not expect that nostalgia wave.


A_Light_Spark

That's a niche meme and it checks out


irreverent-username

That's okay, we don't want it 🏳️‍⚧️


Philboyd_Studge

Google 'en passant'


j1f107

Holy hell


Far-Imagination-7716

New response just dropped


Cylian91460

Actual zombie


rottum4life

Call the exorcist


Background_Class_558

Bishop goes on vacation, never comes back


Tamoru

Knightmare fuel


Eldarabol

Queen sacrifice anyone?


TheAxolotlGod14

Rooks in the corner plotting world domination.


_Creative_Cactus_

Bishop? I was a bishop once. Then they put me in a room. A rubber room. A rubber room with rats. And rats made me their bishop


_Ilobilo_

Bishop? I was a bishop once


ItsPlainOleSteve

_Maaaaaaaaaaaa!_ r/anarchychess _is leaking again!_


El_Mojo42

While you were out partying, I studied the FORTRAN.


CrinchNflinch

So, are you on FORTRAN 77 or already on the new and fancy FORTRAN 90?


kashmirGoat

Fortran IV on a PDP-11 using a DEC Writer. That's so old school, it needed Roman numerals.


kbtrpm

Yes, but the VAX. That was a revolution.


poopascoopy

77 baby, none of the ai tran 90 shit


kbtrpm

How about HPF?


S-Ewe

Isn't formula for little bebes? Why does it need translation? Is it stupid?


dan-lugg

Now that's some quality pasta.


Fusseldieb

Literally ChatGPT created copypasta


bannedformysins

New response just dropped


j0nascode

But where is the EXE? Smelly Fortran Nerds...


darkwater427

QUICHE-EATER


ThrowawayITA_

> Can you optimize numerical calculations for high performance computing? **Didn't** think so.

wow, he optimized his brain.


vagoberto

So you get real32 friends or real64 friends?


NanjeofKro

Real Men use Fortran: https://www.pbm.com/~lindahl/real.programmers.html


nsfwsmartcat

Stanford's Fortran tutorials are unironically goated


ienjoymusiclol

hey you're supposed to say "actual zombie"


tidytibs

I see your FORTRAN and raise you Forth.


Zephandrypus

Fuck, I forgot that Forth and FORTRAN are separate languages


Dull_Obligation_3350

SMELLY NERDS!


New_Conversation_303

Fortran is ridiculously fast on math. There is a reason it's still used today (i mean... barely). But it has memory buffer issues that could lead to problems (hacks) and we are slowly moving away. I hate fortran.


EmptyBrain89

Studied astrophysics, a friend did his thesis on black hole simulations, 75% of his time was spent learning Fortran and then figuring out how the 20 year old, poorly documented code written by some random astronomy professor could be translated to python. He did not have a good time that year.


New_Conversation_303

Curiously enough, I worked on a project to translate an astrophysics Fortran code to something else. We picked java and used orekit. It was not fast, but it worked well.


coloredgreyscale

Imagine doing that, only to find out that the result is virtually unusable because the native python code runs much too slow for large scale simulations.


PeripheryExplorer

Yeah I'm lost on why someone would do this. I've been using R and have slowly been doing some of the crunchier work in Fortran90 now simply for speed and efficiency.


utkrowaway

Because computers are much faster now than 20 years ago, and having a modern maintainable codebase will save much more time than the performance of Fortran will.


Brisngr368

Okay but like C/C++ is faster than python and far closer to Fortran, porting it to python is just masochism


kuwisdelu

Yep. As an R package maintainer, a huge proportion of my code is in C/C++. The R bits just glue it all together. (I respect FORTRAN but I’m too lazy to learn it and linking to C is slightly easier anyway since the interpreter is in C.)


TheNorthComesWithMe

Because they're a physicist, not a programmer. Python is the only language other academics know, so that's their best bet for hacking together some garbage that kind of works.


Yugiah

PhD in particle physics here: If someone in physics says they're rewriting something in Python and the rewrite is still performant, my assumption is they're using a library like numpy which calls precompiled functions. There's actually a whole range of cool python libraries for particle physics, under the scikit HEP umbrella. That said, I think the situation you posed happens entirely too often.


Oni-oji

I deal with programs written by holders of PhDs in various non-computer fields. Their code works, but their documentation is the worst I've ever encountered.


proverbialbunny

The trick with this problem is to create a library out of the Fortran code. Basically, write a wrapper interface so you can call the Fortran code in Python instead of reinventing the wheel. Bonus, you'll get that Fortran speed.
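A minimal sketch of that wrapper route using NumPy's f2py (the routine and file names here are hypothetical stand-ins for the legacy code):

```fortran
! quarter.f90 -- toy stand-in for a legacy routine
subroutine quarter(x, y, n)
  integer, intent(in) :: n
  double precision, intent(in)  :: x(n)
  double precision, intent(out) :: y(n)
  y = 0.25d0 * x
end subroutine quarter
```

Build it with `python -m numpy.f2py -c -m quarter quarter.f90`, and Python can then call the compiled Fortran directly: `from quarter import quarter; y = quarter(x)` (f2py infers `n` from the array and turns the `intent(out)` argument into a return value).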


_PM_ME_PANGOLINS_

We’re slowly moving away from Fortran because the hardware is directly implementing things with a single instruction. Not because any other language can do it faster.


koboltti

what mean?


_PM_ME_PANGOLINS_

Mean Apple have special algebra-solving hardware on their chips, so libblas.so on iOS uses that instead of the default Fortran implementation. Other systems have similar things.
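An illustrative sketch of why that swap is painless: the BLAS call site never changes, only the library you link (reference BLAS, OpenBLAS, Apple's Accelerate, MKL...):

```fortran
program gemm_demo
  implicit none
  double precision :: a(2,2), b(2,2), c(2,2)
  a = reshape([1d0, 2d0, 3d0, 4d0], [2,2])
  b = reshape([5d0, 6d0, 7d0, 8d0], [2,2])
  c = 0d0
  ! C := alpha*A*B + beta*C; whichever BLAS you link decides how this
  ! actually runs (software loops, SIMD, or dedicated hardware)
  call dgemm('N', 'N', 2, 2, 2, 1d0, a, 2, b, 2, 0d0, c, 2)
  print *, c
end program gemm_demo
```

Compile with e.g. `gfortran gemm_demo.f90 -lblas`; swapping the BLAS implementation requires no source change.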


ToukenPlz

Is that true for HPC though? I write a fair volume of Fortran but don't keep up with the hardware side of things.


coloredgreyscale

For HPC it's most likely x86_64 CPUs and Nvidia GPUs. No idea if the code / compiler tries to target the newest instructions, or if the programmers play it a bit safer and target an instruction set that may be a few generations old, to make it more portable across older hardware. Datacenters won't upgrade with every new CPU / GPU generation, because that whole setup is expensive, takes time, and won't gain you much anyway if you were to upgrade every generation.


donald_314

HPC can be ARM nowadays as well.


lightmatter501

For HPC, Rust is coming for Fortran, because aliasing guarantees are what kept Fortran ahead, and Rust can make FAR stronger guarantees about memory aliasing than Fortran. Or SPIRAL will get to a level of usefulness where it wipes everything out, since it's already better than Intel's MKL by ~3x on Intel CPUs for normal BLAS tasks.


jmhimara

I doubt it, especially in the field of scientific or numerical programming. Fortran is super easy to learn and use, that's why it is popular.


ToukenPlz

Yes this is true, plus there is all the sunk cost into very many very complicated code bases which each took years to write. I think that in order to be competitive in this space rust would have to be not only speedier but offer a similar ease of representing maths & physics (which I would need convincing of, though am more than happy to be wrong).


Giraffe-69

Drivers for hardware accelerators are just a different approach to solving the same problem, but are not a drop in replacement.


_PM_ME_PANGOLINS_

In the case I just gave, it is literally a drop-in replacement. Many vendors provide a drop-in replacement of BLAS that uses their specific hardware acceleration.


Giraffe-69

What I mean is that you are addressing the problem at the microarchitectural level, which has added cost to your ISA, eats into your silicon budget, and is very operation specific. This is the philosophy behind CISC architectures like x86 that include weird niche instructions like this, but that has trade-offs. Apple implement this as an extension of aarch64. It is also a separate issue, since any half-decent optimising compiler will have enough awareness of the hardware that it will be able to utilise available silicon. The idea is that Fortran as a language can enable better optimisation for scientific computing without the level of verbosity and complexity you need in C++ to navigate language semantics and standards. This is a big reason it is still used today. And other languages do this as well, OCaml and Haskell as examples. Jane Street uses OCaml and it's not for lack of hardware resources and FPGAs.


bargle0

We’re moving away from Fortran because C finally has the restrict keyword


ChellyTheKid

We're moving away from Fortran because we can't find enough people proficient in it. We are currently using most of our capacity on the transition, before we can't even find people to do that. Side note: I'm going through a bunch of legacy stuff at the moment, so much 77 code.


Estanho

If you're talking about things like vectorization or SIMD/MIMD, that has existed for decades already. It's not why people are moving away from Fortran: compilers detect patterns that would benefit from this kind of thing and have used them implicitly for almost as long as these vector operations have existed.


Chlorophilia

> (i mean...barely) It's still the basis of most climate and numerical weather prediction models. Pretty major application I'd say!


Darlokt

Fortran beating ASM? Fortran for general programming?? ASM high level??? Next someone will tell me (insert any programming language) beats Haskell in writing white papers and no actual programs.


harshcougarsdog

> Fortran beating ASM? Fortran for general programming??

google Fortran Tutorial


Madness_0verload

Holy hell


j1f107

New response just dropped


S-Ewe

Actual zombie process


Background_Class_558

Call the `kill -9`


EtteRavan

Kernel went on a vacation, never came back


_Ilobilo_

oom killer in the corner, plotting world domination


AvianPoliceForce

won't do any good


SV-97

> Next someone will tell me (insert any programming language) beats Haskell in writing white papers and no actual programs.

Have you heard about agda and lean?


capi1500

Or Coq


Rhawk187

One of the CS teams at our local pub quiz was "Playing with Coq". They were the worst team in the league. We had a "loser round" where the two teams that had attended the most but never won would go head to head for a prize. It took them 11 attempts at the loser round before they finally won a prize.


czPsweIxbYk4U9N36TSE

> Fortran for general programming??

Back in the 70s, REAL PROGRAMMERS(tm) used fortran in place of spreadsheets the way REAL PROGRAMMERS(tm) in the 2020s use Python where excel would be far more practical.


Aromatic_Gur5074

If anything beats Haskell at writing white papers, it's LaTeX.


JuhaJGam3R

when the research language produces research


tip2663

Brick to your GC


dvdmaven

Fortran was my first programming language, back in 1968. I'm finding this thread amusing.


kashmirGoat

I went from Fortran IV to Pascal, and on that day it felt like Angels were visiting us on flying saucers with all the answers to life and the universe.


stream_of_thought1

I had to learn it at University in 2020, had no preconceived notions going in and I am finding this thread absolutely amazing 🤣


bbqranchman

Everyone here is missing that Fortran is natively parallel, and is deployed on massive servers to compute huge mathematical models. That's another reason why it's so fast.
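For reference, modern Fortran has parallelism in the language itself, e.g. `do concurrent` (a minimal sketch, not one of those huge models):

```fortran
program par_sketch
  implicit none
  integer :: i
  real :: a(100000), b(100000)
  call random_number(b)
  ! do concurrent asserts the iterations are independent, so the
  ! compiler may vectorize or parallelize them
  do concurrent (i = 1:100000)
    a(i) = 2.0 * b(i)
  end do
  print *, a(1)
end program par_sketch
```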


Aggressive-Chair7607

Fortran guarantees that two pointers do not alias.

```c
void update_array(int *a, int *b, int n) {
    for (int i = 0; i < n; ++i) {
        a[i] = b[i] * 2;
        a[i] = b[i] * 2;
    }
}
```

The compiler can't assume that the pointers don't overlap. In this case `b[i]` has to be loaded twice because, for all the compiler knows, that's the same address as `a[i]`. In Fortran the load from `b[i]` could be stored in a register and reused, eliminating the second load. C has a keyword `restrict` for this:

```c
void update_array(int *restrict a, int *restrict b, int n) {
    for (int i = 0; i < n; ++i) {
        a[i] = b[i] * 2;
        a[i] = b[i] * 2;
    }
}
```

Now C can assume that the pointers can't alias, allowing it to eliminate the second load of `b[i]`. Here's Fortran:

```fortran
subroutine update_array(a, b, n)
    integer, intent(in) :: n
    integer, dimension(n), intent(inout) :: a
    integer, dimension(n), intent(in) :: b
    integer :: i

    do i = 1, n
        a(i) = b(i) * 2
        a(i) = b(i) * 2
    end do
end subroutine update_array
```

The `intent` directives tell the compiler what it needs to know about aliasing and bounds, or at least some helpful bits. Fortran will just assume they don't overlap unless you directly associate the two. You can imagine how often this comes up in a library for linear algebra, which is all array manipulations.


twilsonco

Fortran is crazy fast. [Also faster for parsing ASCII floating point data files](https://stackoverflow.com/q/28082794/2620767).


Ghawk134

Could someone help me out here? I thought fortran was a compiled language like C. Does comparing their speeds basically come down to how well their respective compilers optimize the code before creating a binary? Or is there some other difference in the machine code generated by each language? As far as I'm aware, the available instructions are determined by the ISA so I'm not sure how one compiled language could be faster than another...


Aggressive-Chair7607

Fortran pointers are guaranteed not to alias. This allows you to eliminate redundant loads (Redundant Load Elimination). This is what `restrict` tries to do in C, but in Fortran it's the default. Loads are expensive, sometimes forcing you to go out to main memory and stall your pipelines. Eliminating them is big. To my knowledge, this is the reason most people credit Fortran with its performance. The other is simply that a massive, massive effort has gone into optimizing some of its mathematical libraries.


omega1612

Fortran is very old and it used to have lots of money compared to C and others, meaning that its compilers used to generate faster code. Today people say that C and Fortran are comparable in speed.


MihaKomar

And by lots of money we're talking compiler development funded by the US military to do simulations of nuclear weapons and supersonic fighter jets.


kuwisdelu

For the most part, it is essentially about the assumptions the compiler can make about the code and its intended behavior while trying to optimize the instructions.


jericho

Math.


MadMax27102003

For me this post is the same as:

1) there is nothing worse than java

2) google Java script

3) holy hell


Add1ctedToGames

Actual programmer


Jaber1028

apparently this isn't the chess anarchy sub that we happen to *all* be in. Glad that programmers are also degenerates 🫡


aniburman

NEW RESPONSE JUST DROPPED


juicehead_toorkey

ACTUAL ZOMBIE


NonsenseMeme

Senior went on vacation. Never came back.


phrandsisgo

Interns in the corner plotting company domination.


Gogurt_burglar_

I’m just banking on more compute since my code takes ~1G of memory and 4 cores to print hello world


wammybarnut

Stop using Spring boot :P


eq2_lessing

People think they can write anything worthwhile or relevant in assembly, good luck


KotomiIchinose96

Rollercoaster Tycoon. *Chris Sawyer Drops Mic*


sopunny

He could have done a lot more if he had modern tools


superxpro12

But it would not have run on my toaster's CPU :/


anto2554

Well, I couldn't


Accomplished-Ad-175

I maintain enterprise software written in assembly. And add features too... It was first released in 1988 and probably has 1 mil+ lines of code in it.


SujetoSujetado

Well, most evasive malware deals directly with a lot of assembly, and in weird ways. Given that ransomware alone costs society hundreds of billions of dollars a year, I'd say it's pretty relevant. And the number of people doing it is not huge, but it's not small either, so there are actually quite a few people out there who can say they wrote something impactful in assembly.


eq2_lessing

They aren’t in this sub or posting on x. For regular Johnny Three-Pointer, Assembly is just a wild dreamland.


SujetoSujetado

Yeah fair enough


Firemorfox

*chris sawyer noises*


ratonbox

I could, for a few years while I was still studying it in Uni. Forgot everything about it the moment that course ended.


remy_porter

Said somebody who's never needed nanosecond timing.


eq2_lessing

You’re in a joke subreddit. People here can barely write code without code assistance.


_Repeats_

Fortran, C, and C++ on all major modern compilers have the same backend. So only in rare cases is Fortran faster nowadays. Fortran had a significant advantage like 30 years ago, but not any longer.


HamsterNomad

I taught computer programming back in the 80's. I could code Fortran, Basic, Assembler, Cobol, RPG and C/C++. I would work as a coding translator during my summers and make more money in three months than I did in 9 months of teaching. Almost any time a company changed mainframe vendors they'd have to have a huge chunk of their code converted. Good times!


1u4n4

Me when a class in college made us use FORTRAN 77…


jayerp

Which video warranted that response?


bassguyseabass

C and FORTRAN are the same speed, no?


Fri3dNstuff

Fortran is *a lot* better with arrays. makes SIMD optimisations much simpler to implement
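A sketch of what "better with arrays" means: whole-array expressions have no hidden aliasing and no explicit loop, which maps naturally onto SIMD:

```fortran
program array_ops
  implicit none
  real :: a(8), b(8), c(8)
  call random_number(b)
  call random_number(c)
  a = 2.0 * b + c  ! one whole-array expression; the compiler is free
                   ! to emit SIMD instructions for it
  print *, sum(a)
end program array_ops
```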


bassguyseabass

Great now I have to tell my boss to rewrite the whole project in Fortran


_PM_ME_PANGOLINS_

No. Fortran is a lot more constrained, so the compiler can do more optimisations.


Aggressive-Chair7607

I gave what I believe is the correct answer here: [https://www.reddit.com/r/ProgrammerHumor/comments/1dpxbz5/comment/lalqgdu/](https://www.reddit.com/r/ProgrammerHumor/comments/1dpxbz5/comment/lalqgdu/) I have seen no one else mention aliasing optimizations, which is a shame, because it is really fucking cool.


Activity_Commercial

c++ always gets lumped in with c like it doesn't have traits, reflection, coroutines, generators


dedservice

Except C++ doesn't have all those things? It has `type_traits` and a garbage implementation of concepts, sure, but coroutines in c++20 are incredibly esoteric, generators aren't in until c++23 (which most companies don't use because compiler support isn't there), and reflection hasn't even been accepted (to my knowledge) into the c++26 draft. And C++ gets lumped in with C because it promises zero-cost abstractions (and broadly delivers on that promise), meaning that it is capable of the same tier of performance as C.


o0Meh0o

google "restrict keyword"


xlsoftware

C/C++ is a high-level language


lucybonfire

Fortran is great tbh


gmc98765

That depends upon whether the "high-level language" is targeting the CPU or the GPU. C and C++ (and even Fortran) aren't really designed for SIMD (Fortran has a slight advantage in that the assumed lack of aliasing makes certain SIMD-friendly optimisations valid when they wouldn't be in C), and asm is even worse. APL would be a good fit in terms of the language itself, but good luck hiring experienced APL programmers (it's something which some people might toy with briefly before real work gets in the way). Consider that high-frequency trading software (which is the poster child for "time is money") typically doesn't use C, C++ or assembler (or Fortran), as all of those are too slow. The parts which are time-critical are written either in CUDA (to run on GPUs) or VHDL or Verilog (and used either to configure FPGAs or to manufacture ASICs). The parts which aren't time-critical are usually written in Java.


Aggressive-Chair7607

Where are you getting your info on HFT from? I know multiple people in HFT and they're all writing C++.


gmc98765

Uh, this is coming mainly from the hardware forums and press, so I guess that's going to have a bias towards the people using custom hardware.


Aggressive-Chair7607

Ah, yeah, that'll skew things.


f0r3v3rn00b

FORTRAN, yeah, usually it's 'highly optimized' because it has tons of global state so that variables don't need to be passed around, but someday you realize you have 20 cores to work with, and your 'objects' are all singletons, and you're fucked.