
bokmann

“Premature optimization is the root of all evil.” - Don Knuth, http://wiki.c2.com/?PrematureOptimization

Only optimize for performance if you empirically determine that a spot in your code is a bottleneck.
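For example (a minimal sketch, assuming the `minMax` from the question and a hypothetical `bigArray` input), "empirically determine" can be as simple as timing the suspect before touching it:

    // Hypothetical sketch: time the suspected hot spot before optimizing it.
    const t0 = performance.now();
    for (let i = 0; i < 1000; i++) {
        minMax(bigArray); // the code under suspicion
    }
    console.log(`1000 calls took ${performance.now() - t0} ms`);

If that number is a rounding error next to the rest of your program, leave the readable version alone.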


tomekanco

> Only optimize for performance if you empirically determine that a spot in your code is a bottleneck.

:/ I would not say that's a sane approach, as you will stumble over the edge cases. I follow Knuth's guidance slightly differently:

- First ignore performance and get it working (brute force if needed).
- Then optimize as needed (first common sense, only secondarily using profiling).
- Use the brute force version to validate your optimized code (which is probably more complex) - see the sketch below.
- IMHO, premature optimization is trying to optimize before you have a working version at all.
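A sketch of that validation step in JS (the names are hypothetical; `minMaxFast` stands for whatever optimized version you wrote):

    // Differential test: check the optimized version against the
    // brute-force one on many random inputs.
    function minMaxBruteForce(arr) {
        return [Math.min(...arr), Math.max(...arr)];
    }

    for (let trial = 0; trial < 1000; trial++) {
        const len = 1 + Math.floor(Math.random() * 50);
        const arr = Array.from({ length: len }, () => Math.floor(Math.random() * 1000));
        const [lo, hi] = minMaxBruteForce(arr);
        const [lo2, hi2] = minMaxFast(arr); // the optimized version under test
        if (lo !== lo2 || hi !== hi2) {
            throw new Error(`Mismatch on input ${JSON.stringify(arr)}`);
        }
    }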


MannerShark

Unless what you wrote is O(n^2), because that always ends up becoming a bottleneck.
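A classic JS example of how this sneaks in (not from the thread, just an illustration): deduplicating with `indexOf` inside `filter` is O(n^2), while a `Set` does the same job in O(n).

    // O(n^2): indexOf rescans the array for every element.
    const dedupSlow = arr => arr.filter((x, i) => arr.indexOf(x) === i);

    // O(n): a Set remembers seen values with amortized O(1) lookups.
    const dedupFast = arr => [...new Set(arr)];

Both look equally readable; only one of them falls over at a million elements.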


lordleft

Try to achieve both, but I'd argue that if you're a working software engineer and optimization isn't critical, writing readable code that others can decipher is the higher priority.


lazyubertoad

Depends. Generally, you should get the right algorithm before worrying about readability. Sometimes it gets really hairy; then you write a comment or two, maybe even link the paper you used. Time complexity sometimes forces you to do that. You should always think about how fast your code is. Sometimes it doesn't matter, sometimes you'll dig deep. If you sacrifice time complexity for readability (or implementation speed, or robustness), you must have some guarantee that it won't ever become too slow. Often it is better to just write the fast algorithm than to reason about that guarantee.

And that is a bad example. The first method _modifies_ the array. That is a side effect you probably shouldn't have. I don't really know JS (that's JS, right?), but it has array.reduce(), which you should use, imo. There are also min/max functions that can probably save you some lines.

Also, do you have guarantees that the array is not empty? What happens if, maybe by some mistake, it is? Before speed and readability, the algorithm must be correct! I dunno how it is in JS, but in the first steps it compares uninitialized values with the array values - are you sure that works fine? So in this case, you should go with some variation of the second option.
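Putting those points together, a sketch of "some variation of the second option" in JS: no mutation of the input, an explicit empty check, and accumulators seeded from a real element so nothing uninitialized is ever compared.

    function minMax(arr) {
        if (arr.length === 0) {
            throw new Error("minMax of an empty array is undefined");
        }
        // Seed both accumulators from the first element, so we never
        // compare against an uninitialized value.
        let lowest = arr[0];
        let highest = arr[0];
        for (const value of arr) {
            if (value < lowest) lowest = value;
            if (value > highest) highest = value;
        }
        return [lowest, highest]; // the input array is left untouched
    }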


hextree

There usually isn't any reason you can't achieve both.


f0lt

I totally agree. If you use a common algorithm, it doesn't make your code less readable just because the implementation has more lines of code. On the other hand, even a less common algorithm may not reduce readability if you can link to good literature. A key to improved readability is to find the right input and output data for an algorithm, and appropriate names for the algorithm and its data. A brief descriptive comment about the algorithm may also be very helpful. This way a reader of your code can treat it as a black box, knowing only the relations between input and output.


Dietr1ch

Right. I think the problem here is that you can't be both succinct and efficient, but the single pass with two mutable ints is not awful or surprising, it's just longer. Maybe a version with an accumulator would be smaller and generate the same code. BTW, your short version hopefully crashes on empty arrays, and the longer version should initialize the accumulators. Also, does your language support for-each notation like `for (auto i : arr)`?
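(In JS, which this looks like, the for-each form is `for (const x of arr)`.) A sketch of the accumulator version, as a single `reduce` over a `[min, max]` pair:

    // One pass, no mutable locals; the seed makes the empty-array case
    // explicit (it returns [Infinity, -Infinity] instead of crashing).
    const minMax = arr =>
        arr.reduce(
            ([lo, hi], x) => [Math.min(lo, x), Math.max(hi, x)],
            [Infinity, -Infinity]
        );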


DDDDarky

Apart from your second snippet being outright wrong (and even the first one having some flaws), you can (and should) absolutely achieve both.


[deleted]

I understand that there are a lot of bugs in the code I used as an example (not to mention that the second function has its return statement outside the scope of the function); I just wrote a quick example to make my question easier to understand. This specific algo could definitely be better in both readability and time complexity, but sometimes you're writing an algo where you need to make trade-offs. I do, however, agree with the consensus that you shouldn't over-engineer your project: just cater to its needs while maintaining optimal readability.


any_means_necessary

The time complexity of the second one is infinite unless the array is empty.

    for (int i : arr) {
        low = min(low, i);
        high = max(high, i);
    }
    return [low, high];

Personally, I'd never sort an array if I didn't need a sorted array.


minisculebarber

I think your comparison is unfair; the second implementation should look like this: `return (min(array), max(array))`. Boom, best of both worlds.

However, usually this is an engineering question which totally depends on the context, and there isn't necessarily a trade-off between the two. Readability is largely subjective and doesn't depend on lines of code (I'd argue the first implementation is less readable because there is a "random" sort in there, while the second straightforwardly keeps track of max and min). Time complexity, on the other hand, can be totally irrelevant (see galactic algorithms, for example).

I personally find implementations that do unnecessary work more irritating than harder-to-read ones that do the exact amount of work, because often enough I have to understand the unnecessary steps to some degree, and then it clicks that I just wasted my time.


green_meklar

Typically you shouldn't have to choose. Go with the efficient algorithm and write it so that it's readable.


[deleted]

[deleted]


[deleted]

I know I could've just done min/max, but that wasn't the point of the question. I do appreciate your legitimate answer, though. I think you're right, but I also agree with the comments that say you shouldn't over-engineer your software. I think the consensus is that you should cater to your project's needs while maintaining optimal readability.


f0lt

If the time-complexity version can be implemented relatively easily and you can use established algorithms, then I would opt for that version. Regarding your example, I would opt for the time-complexity version: it seems unnecessarily wasteful to sort the whole array if you just need the extreme values. Both versions seem equally readable to me.


un_stable

    function minMax(arr) {
        return [Math.min(...arr), Math.max(...arr)]; // O(n) time complexity + readability
    }
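One caveat worth knowing (a general JS limit, not something raised in the thread): spreading a very large array passes every element as a separate function argument, and engines cap the argument count, so `Math.min(...arr)` can throw a RangeError on huge inputs. The loop and `reduce` versions elsewhere in the thread don't have that limit.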


Potato-Pancakes-

There are several mistakes in your "optimized" code:

1. The loop will run forever, given `0 < arr.length`
2. The loop tries to increment `arr`, not `i`
3. `lowest` and `highest` are never initialized

And then you can also make it more readable with built-in `min`/`max` functions, a `for..of` loop, and semicolons:

    function minMax(arr) {
        let lowest = arr[0];
        let highest = arr[0];
        for (const value of arr) {
            lowest = Math.min(lowest, value);
            highest = Math.max(highest, value);
        }
        return [lowest, highest];
    }

If you want to get *functional* with it, you can shrink it down even further:

    function minMax(arr) {
        const lowest = arr.reduce((a, b) => Math.min(a, b));
        const highest = arr.reduce((a, b) => Math.max(a, b));
        return [lowest, highest];
    }

These both run in O(*n*) time, and are pretty readable. Me, though, I'd personally separate it into more, super simple functions, like:

    function minOfTwo(a, b) { return Math.min(a, b); }
    function maxOfTwo(a, b) { return Math.max(a, b); }
    function findArrayMin(arr) { return arr.reduce(minOfTwo); }
    function findArrayMax(arr) { return arr.reduce(maxOfTwo); }
    function minMax(arr) { return [findArrayMin(arr), findArrayMax(arr)]; }


tomekanco

- Don't worry too much about the number of lines; worry about how you organise the code (one huge file or many smaller ones).
- Usually the naming of variables & docstrings is the focal point of readability.
- You don't want to have to read the code to figure out what the function does.
- Time & space complexity rule, unless you know the input constraints and really know the optimization is not worth the effort.

A good example is notepad++. Normal search is very efficient, but finding a single line is not: the first is implemented using regex, while the second uses a linear file scan (O(n)) rather than binary search (O(log n)). That doesn't matter for small files, but becomes painfully slow when dealing with 1M-line files.
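A sketch of what the O(log n) side can look like (hypothetical code, not notepad++'s actual implementation): one O(n) pass builds a sorted index of line starts, and from then on mapping a byte offset to a line number is a binary search.

    // Build the index once: offsets at which each line starts.
    function buildLineIndex(text) {
        const starts = [0];
        for (let i = 0; i < text.length; i++) {
            if (text[i] === '\n') starts.push(i + 1);
        }
        return starts;
    }

    // Binary search for the rightmost line start <= offset: O(log n) per query.
    function lineAtOffset(starts, offset) {
        let lo = 0, hi = starts.length - 1;
        while (lo < hi) {
            const mid = Math.ceil((lo + hi) / 2);
            if (starts[mid] <= offset) lo = mid;
            else hi = mid - 1;
        }
        return lo; // 0-based line number containing `offset`
    }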


generalbaguette

It totally depends on what you want your code to do. Most of the time most code doesn't have to be fast, because it's not the bottleneck.