Jonathan Blow's HN comments


context

I felt obliged to comment because I feel I know what you are talking about and I also worry that much of the advice posted so far is wrong at best, dangerous at worst.

I am a 42-year-old very successful programmer who has been through a lot of situations in my career so far, many of them highly demotivating. And the best advice I have for you is to get out of what you are doing. Really. Even though you state that you are not in a position to do that, you really are. It is okay. You are free. Okay, you are helping your boyfriend's startup but what is the appropriate cost for this? Would he have you do it if he knew it was crushing your soul?

I don't use the phrase "crushing your soul" lightly. When it happens slowly, as it does in these cases, it is hard to see the scale of what is happening. But this is a very serious situation and if left unchecked it may damage the potential for you to do good work for the rest of your life. Reasons:

* The commenters who are warning about burnout are right. Burnout is a very serious situation. If you burn yourself out hard, it will be difficult to be effective at any future job you go to, even if it is ostensibly a wonderful job. Treat burnout like a physical injury. I burned myself out once and it took at least 12 years to regain full productivity. Don't do it.

* More broadly, the best and most creative work comes from a root of joy and excitement. If you lose your ability to feel joy and excitement about programming-related things, you'll be unable to do the best work. Note that this issue is separate from, and parallel to, burnout! If you are burned out, you might still be able to feel the joy and excitement briefly at the start of a project/idea, but they will fade quickly as the reality of day-to-day work sets in. Alternatively, if you are not burned out but also do not have a sense of wonder, it is likely you will never get yourself started on the good work.

* The earlier in your career it is now, the more important this time is for your development. Programmers learn by doing. If you put yourself into an environment where you are constantly challenged and are working at the top threshold of your ability, then after a few years have gone by, your skills will have increased tremendously. It is like going to intensively learn kung fu for a few years, or going into Navy SEAL training or something. But this isn't just a one-time constant increase. The faster you get things done, and the more thorough and error-free they are, the more ideas you can execute on, which means you will learn faster in the future too. Over the long term, programming skill is like compound interest. More now means a LOT more later. Less now means a LOT less later.

So if you are putting yourself into a position that is not really challenging, that is a bummer day in and day out, and you get things done slowly, you aren't just having a slow time now. You are bringing down that compound interest curve for the rest of your career. It is a serious problem.

If I could go back to my early career I would mercilessly cut out all the shitty jobs I did (and there were many of them).

One more thing, about personal identity. Early on as a programmer, I was often in situations like you describe. I didn't like what I was doing, I thought the management was dumb, I just didn't think my work was very important. I would be very depressed on projects, make slow progress, at times get into a mode where I was much of the time pretending progress simply because I could not bring myself to do the work. I just didn't have the spirit to do it. (I know many people here know what I am talking about.) Over time I got depressed about this: Do I have a terrible work ethic? Am I really just a bad programmer? A bad person? But these questions were not so verbalized or intellectualized, they were just more like an ambient malaise and a disappointment in where life was going.

What I learned, later on, is that I do not at all have a bad work ethic and I am not a bad person. In fact I am quite fierce and get huge amounts of good work done, when I believe that what I am doing is important. It turns out that, for me, to capture this feeling of importance, I had to work on my own projects (and even then it took a long time to find the ideas that really moved me). But once I found this, it basically turned me into a different person. If this is how it works for you, the difference between these two modes of life is HUGE.

Okay, this has been long and rambling. I'll cut it off here. Good luck.


context

Being a very experienced game developer who tried to switch to Linux, I have posted about this before (and gotten flamed heavily by reactionary Linux people).

The main reason is that debugging is terrible on Linux. gdb is just bad to use, and all these IDEs that try to interface with gdb to "improve" it do it badly (mainly because gdb itself is not good at being interfaced with). Someone needs to nuke this site from orbit and build a new debugger from scratch, and provide a library-style API that IDEs can use to inspect executables in rich and subtle ways.

Productivity is crucial. If the lack of a reasonable debugging environment costs me even 5% of my productivity, that is too much, because games take so much work to make. At the end of a project, I just don't have 5% effort left any more. It requires everything. (But the current Linux situation is way more than a 5% productivity drain. I don't know exactly what it is, but if I were to guess, I would say it is something like 20%.)

That said, Windows / Visual Studio is, itself, not particularly great. There are lots of problems, and if someone who really understood what large-program developers really care about were to step in and develop a new system on Linux, it could be really appealing. But the problem is that this is largely about (a) user experience, and (b) getting a large number of serious technical details bang-on correct, both of which are weak spots of the open-source community.

Secondary reasons are all the flakiness and instability of the operating system generally. Every time I try to install a popular, supposedly-stable Linux distribution (e.g. an Ubuntu long-term support distro), I have basic problems with wifi, or audio, or whatever. Audio on Linux is terrible (!!!!!!), but is very important for games. I need my network to work, always. etc, etc. On Windows these things are not a problem.

OpenGL / Direct3D used to be an issue, but now this is sort of a red herring, and I think the answers in the linked thread about graphics APIs are mostly a diversion. If you are doing a modern game engine and want to launch on Windows, Mac, iOS, and next-generation consoles, you are going to be implementing both Direct3D and OpenGL, most likely. So it wouldn't be too big a deal to develop primarily on an OpenGL-based platform, if that platform were conducive to game development in other ways.

I would be very happy to switch to an open-source operating system. I really dislike what Microsoft does, especially what they are doing now with Windows 8. But today, the cost of switching to Linux is too high. I have a lot of things to do with the number of years of life I have remaining, and I can't afford to cut 20% off the number of years in my life.


context

I am surprised that so many people here are okay with the idea of being momentarily detained as a matter of course.

This is not a turnstile, because turnstiles do not detain you or trap you; you can always move freely on one side or the other. This device detains you and then lets you move on. What are you going to do when it decides not to let you move on?


context

Any time one is trying to have a serious argument, one must do it from the Principle of Charity:

http://en.wikipedia.org/wiki/Principle_of_charity

This article doesn't even come close. Elon Musk is in charge of the design of rockets that have successfully delivered payloads to the ISS. It is just basic competence of a reasoning mind to presume Elon knows some things about thermal expansion (rockets get very hot!).

So someone who is trying to have a reasonable argument would say, okay, he understands this issue, so I wonder what the answer is and why he doesn't think it is a big enough deal to go into detail on this point. Or perhaps I misunderstand something about the design (always a reasonable assumption!)

This article is about as far from that as it could be. I don't find it to be worth reading.


context

The attempt to put a positive spin on the ad-version is kind of absurd.

Translation: "We tried to serve ads in a way that broke basic functionality for many people. But we didn't make that much money, so we are going to stop being malicious actors, and we're going to start following the protocols we're supposed to follow."


context

I agree that latency is terrible, but this is bad design that actually makes the situation worse. Now instead of a constant, learnable latency, you have latency that varies wildly depending on what page you are on.

This is the wrong solution, in that it makes things more complicated and results in a generally poor experience even after the change. But that is what Google does all the time in UI, so I guess I have no reason to be surprised by this.

The proper solution: Do not make double-tap be a UI action. Done.

Or, if you insist on double-tap, make it only be an action that stacks transparently with single-tap. For example, in touch controls for The Witness, single-tap makes you walk toward the target. Double-tap makes you run. So as soon as we read a single tap, we can start doing the action without delay, and if we see the second tap we just kick up your target speed. It works great.
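
(Illustration added here, not code from The Witness; a minimal sketch of that stacking behavior, with hypothetical names and thresholds:)

    enum Speed { WALK, RUN };

    // Hypothetical game-side hooks, stubbed out for the sketch:
    static void set_move_target(float x, float y) { (void)x; (void)y; }
    static void set_move_speed(Speed s) { (void)s; }

    struct TapHandler {
        double last_tap_time = -1.0e9;
        double double_tap_window = 0.3;  // seconds; an assumed threshold

        // Act on the first tap immediately; a second tap inside the window
        // just upgrades the already-running action instead of delaying it.
        void on_tap(double now, float x, float y) {
            if (now - last_tap_time < double_tap_window) {
                set_move_speed(RUN);       // second tap: kick up the target speed
            } else {
                set_move_target(x, y);
                set_move_speed(WALK);      // first tap: start walking, no delay
            }
            last_tap_time = now;
        }
    };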

In short, hey Google, please stop doing band-aid solutions that make things worse, and hire some people who really have solid design vision, and give them the power to get things done.


context

This is not just middlebrow dismissal, it's outright wrong. In the AAA world we would love to spend only 3 weeks of an engineer's time to get a 15% speedup. Seriously, that is a great deal, it's like, where do I sign up?

However, it becomes substantially less impressive when you notice that you're using 2x or 4x the amount of processor hardware (2 or 4 cores) and only getting a 15% speedup. In a by-hand implementation that would be very disappointing.

If it were fully automated that would still be pretty valuable, but it appears that this isn't. So it seems to be of questionable utility.


context

Ugh. I prefer the Carmack version.

This example turns the code into something that appears more airy but in fact is much harder to understand due to extensive use of the ?: operator.

I find that one of my own major steps toward programming maturity happened when I stopped doing goofy things like this and started writing code that was as simple as possible to logically follow, and that was as un-special-cased as possible. (By this latter I mean, if you change the code a little bit, you don't have to rewrite it; it looks basically the same. Imagine you want to do more than just assign one variable inside the clauses of the 'if'. In the Carmack version you just add more code there. In the proposed substitute, you have to rewrite the whole thing.)
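
(To make that concrete, a hypothetical reconstruction of the two shapes, not the actual code under discussion:)

    // 'if' version: change it a little and it still looks basically the same.
    if (important_case) {
        x = a;
        // Easy to add more code here, or set a breakpoint on just this path.
    } else {
        x = b;
    }

    // ?: version: the moment you need to do anything more than assign one
    // variable, you have to rewrite the whole expression.
    x = important_case ? a : b;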


context

In the answers I do not see any mention of the Second Time Around Problem, which conclusively answers the question with "there is no way to know".

The idea is: suppose there is a universe that has elements that are fundamentally random. It is nondeterministic. Well, let that universe run for its lifetime, and record everything that happens. Then make a deterministic universe that just plays back the recording (this is the "second time around").

From the viewpoint of someone living inside the universe, there is no way to tell whether it is the first time around or the second time around.

...

But the other thing to point out is that this question presumes an old idea about the passage of time, which is that things happen in a sequence A, B, C, D, ... and that if you are at C then D "has not happened yet". But if you look at relativity, this appears to be a naïve viewpoint. In relativity, the time at a faraway point in space that you would consider "simultaneous" with your own clock depends on the relative speed between you and that point. As you speed up and slow down, you can make a faraway point "go forward or backward in time" with regard to which moment there you would consider "now". The crazy thing is that for angular movements the relative speed is amplified by distance, so when you are moving around at everyday speeds the "now" on planets across the galaxy is going back and forth by thousands or millions of years. (This is hard to observe because you are viewing tiny amounts of light from very very far away that have been traveling for a very long time, and the light that you are about to see was very close to you when you did the angular movement so it will not be much affected, etc, but hey, the math says what it says, you either believe what physics tells you or you don't.)
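
(For reference, the standard relativity-of-simultaneity relation behind this claim, added here:)

    % For an observer whose speed changes by v, the event that counts as
    % "now" at a location a distance d away (along the motion) shifts by:
    \[
        \Delta t = \frac{v \, d}{c^{2}}
    \]
    % The shift grows linearly with d, which is why modest velocity changes
    % correspond to large swings in the distant "now" at galactic distances.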

So when you make a distant "now" go forward, then backward, then forward again, do you expect the two forwards to be the same, or not?


context

Those of us who own electric cars know that electric cars have already won. They are just too good compared to regular somewhat-junky gasoline cars. The benefits are too numerous to list.

The only problem that I have with my Roadster is that if I want to take a long road trip, there has to be a big stopover in the middle, which makes trips much longer. The Model S solves that. Therefore, EVs have won. They are here now, they are real, and they work. All that needs to happen is for the cost to come down, but there is nothing preventing that.

If you have parking at home, an EV is way better than gas or hydrogen because you never need to take your car somewhere to fuel it, ever (unless on a very long road trip). You just come home, park, and plug it in. It is hard to communicate how good this feels until you've had an EV for a few weeks and you're driving past all these gas stations and kind of laughing because you don't need them.


context

How are you supposed to deal legally with your government doing something bad, when it is illegal for you to know about the bad things?

Just think about your proposition for even ten seconds.


context

Just get rid of the damn headers. Users don't want them. They are just misguided ways of trying to raise retention, but really what they do is make your site less pleasant to use, which then makes me not want to come back later.

context

It is the government's own claim that, for example, the President didn't know what the NSA was doing:

http://www.cnn.com/2013/10/28/politics/white-house-stopped-w...

In the absence of leaks and whistleblowing, how do you propose that this kind of power be reined in?


context

The article is hard to read because it feels like this guy is really fooling himself.

If he were as good in academia as his rhetoric claims (building software that "revolutionized" a field) he should have no problems. He should not even need a job, as he ought to be able to just start something. He should have no shortage of strong ideas about what he could be doing.

Instead he is aimlessly searching for a job.

So, I have no choice but to disbelieve his rhetoric. He probably isn't particularly good at anything, and just stumbled through the PhD system. Well, surprise, that isn't worth much!


context

They are not OpenGL derivative. I don't know why this rumor propagates. Oh wait yes I do, because those of us who know what the APIs look like in detail generally can't say anything about them because of NDAs. Sigh.

To be clear: None of the APIs you listed are based on OpenGL. PSGL was an OpenGL implementation, but nobody writing a high-performance game used it because it was too slow and unreliable. The APIs used by almost all shipping games you can think of are substantially lower-level.


context

Actually, no, this is just the timing with which things happened.

Mike Abrash still worked at Valve last week.


context

Yet another Visual Studio, yet another truckload of "features" that do nothing to help day-to-day, heavy-lifting programmers. (And which probably hurt, by adding bugs or just bloating the system.) Sigh.

I am hoping they have made substantial improvements that are not mentioned in this blog post.


context

The impact of variable latency is well-known among people who actually study human factors. It makes it hard for your brain to internalize interactions and make them automatic, and it increases the feeling of frustration.

In video games specifically, where I work, it is well-known that you would rather have a game that runs slower, but with a solid frame rate, than a game that has a highly-variable frame rate that is faster on average. (This is more of a continuum situation than a discrete action like a tap, but the basic principle holds).

P.S./edit: Double-tap only means zoom because that is what they happened to implement. It is a convention easily changed. Look at how heavily Apple just revamped their entire interface, for an audience that is arguably much less savvy than the Android audience.

Using double-tap in this way was a mistake; it is an easy mistake to fix if you have enough design vision to know where you are going.


context

You are presuming a lot about what my early career looks like. Really I never worked any jobs as bad as what you describe. Well, maybe the time I did data entry during my first year of college. The longest I ever worked in a highly corporate environment was 7-8 months, but at least at that time I was doing something at least slightly cool (a port of Doom 2 back when that was a new game, and with which I had full autonomy).

What you seem to assume is normal is some kind of career hell I would never want to be in (nor have ever been in).


context

I have owned 4 or 5 Optimus laptops and have never gotten a practical benefit from Optimus. At best I leave it switched to the fast GPU permanently. At worst, something goes wrong.

The state of Windows laptops these days is really kind of embarrassing. Nobody knows how to build something of quality. Everything appears to be driven by bullet-point features with a goal of providing USPs, but nobody really cares if the features work or if the overall product is good.

This extends to everything (keyboard and trackpad design, screen, preloaded software, function key mappings, etc).

I buy 2 or 3 laptops a year; traditionally the case has been that most laptops were kind of bad but if you looked hard you could find something good. Now it has gotten to the point that the something good no longer seems to exist at all.

I dread the idea of buying a new laptop now. This can't be the high-level result that these OEMs really want.


context

Many game programmers decided long ago that object-orientedism is snake oil, because that kind of abstraction does not cleanly factor a lot of the big problems we have.

There isn't anything close to unanimous agreement, but the dominant view is that something like single inheritance is a useful tool to have in your language. But all the high-end OO philosophy stuff is flat-out held in distaste by the majority of high-end engine programmers. (In many cases because they bought into it and tried it and it made a big mess.)


context

Networking is for people who don't know what they are doing and who don't have better ideas regarding what to do with their time and energy. So if you go to a networking event, understand that you are automatically putting yourself into this class of person.

If you are someone who provides a lot of value, other people will go out of their way to meet you, and then you don't have to go to networking events. So the fact that you are doing networking implies that you are someone who does not provide a lot of value (or else that people don't know what value you provide).

Do you think Elon Musk goes to a lot of networking events? Do you think Steve Jobs went to a lot of networking events?

If you are early in your career and legitimately aren't providing a lot of value yet, because it's early, then I would offer that your time is much better spent cloistered away becoming excellent at what you do, than it is networking. Because if the arc of your career involves you being excellent at what you do, then very quickly you will find that people you meet randomly at events like this are not in your league -- that's just how things are everywhere all the time.


context

I think it just plugs into the Ayn Randian / objectivist mindset. "Some people are hard workers, and others are parasites, let's identify the parasites and axe them and we will have a utopia."

Yes, nobody thinks of it in such an outright simplistic way, but I do think this is the basic appeal.


context

Cute site. It doesn't change the fact that the iOS icon standard is terrible, though. (It defeats the human visual system's attempts to recognize objects by silhouettes, which is why so many people take so long to find the icon they are looking for).

context

I have never encountered a situation like this before in an airport. If I were to, I would be very disturbed, and avoid that airport in the future. So, this is new and unusual to me. (Apparently, also, to enough people to make this video notable).

Subways that I have seen that require fare cards on exit do not actually prevent you from leaving; you can jump the turnstile or go through a gate on the side that possibly alarms. This is not the greatest thing, but at least while you are inside you can move freely inside a large subway system; being encased in plexiglass until a light turns green is a whole different degree of trapped.

Edit: And I guess this is an important part of the point: degree matters. The old boiling-a-frog story is just about slowly increasing the degree of something.


context

This is a contemptible article. It seems mostly to be a product of PR, and for an American audience it seems intended to perpetuate the cognitive capture regarding razors being things that a lot of thought goes into and that therefore justify high prices.

Do the math for how much a razor "should" cost, and ask why you can't buy one for remotely that much at your local Walgreens. Ask why stuff like Dollar Shave Club exists, and then look at their prices and figure out what their margins are.


context

I have done meditation for years now and found it to be very beneficial.

However, I find this article to be very Cargo Cult and am disturbed that nowhere in this entire thread has it been called out as such.

"Look! Meditation must do things because we can make these colored charts telling you about beta waves. What are beta waves? Well, it doesn't really matter, just think of them as bad, because look, meditation does things to your brain, okay??"

The benefits of meditation to mood, creativity, etc are pretty easy to verify for yourself, subjectively. It disturbs me that we feel that adding scientismic mumbo-jumbo gives it credibility somehow. What is presented in this article is not actual science.

There is actual science involving meditation and the brain, but it is in extremely early stages and is hard to draw conclusions from. Our understanding of the brain, in general, is very early! Please be suspicious of pretty colored charts showing brain activity.


context

Yes, and I am surprised that this didn't end up as answer #1 in response to the original question. This problem is an absolute showstopper if you need to manage binary assets (which for something like video games, you absolutely do -- you want to know that your data is the right data for the current version of the code, because the data is changing all the time too).

In The Witness (http://the-witness.net/news) we have 20GB of data checked into svn. Try that with git.


context

It looks to me like the author is trying hard to justify localization because he wants it to be worthwhile, but looking at the numbers, I don't see the evidence. The reasons he gives seem like really big stretches and factual cherry-picking.

It is nice to make your game available in many languages, but getting translations that aren't terrible is hard, and I have never seen a clear business case for it. So I think the proper attitude is "there is not an obvious business case, but we are doing it because we want to."


context

And then you can cause Google to DDoS someone else's site by sending out spam containing lots of image URLs.

context

I think part of the point is that NASA was pretty much flailing even just in the "send shit into Low Earth Orbit" department.

context

I think it actually is different now.

When I was in college (1989-1993), people did CS stuff because they thought computers were cool and could do things that were kind of amazing. Some were more visionary than others, of course. There was a vague idea that you could make a lot of money doing it, but it wasn't why one did things.

Now it seems the opposite of that. Especially here. Everyone is all startup, startup, startup. Honestly it makes me feel a bit ill, because there is very little talk of why one might do things and what is ultimately important.

I do think the support network built up by YC is great, and it is really exciting to see that young people just out of school can find support to go and do something new and interesting. But I think the actual projects coming out of this process are usually kind of bankrupt when it comes to things that I value.

I went to YC Demo Day a year ago with the intention to definitely invest, but I got so disheartened by the projects being presented that I never ended up investing anything. (Actually I wanted to write a check to Leftronic but they never returned my email.)


context

Sorry, but you are playing some really shitty games, then.

context

Note the extreme lack of women commenting in this thread, as well as the lack of discussion of any of the substance of the livestream.

This thread is 100% nerdy dudes feeling offended by this event, plus other dudes attempting to counter this.

Is not this thread itself indicative of a giant problem?


context

The only scientific viewpoint is that we do not know whether or not strong AI is possible. Materialists think it probably is, since they can't see any reason why not. But we don't understand intelligence, so we don't know if we are missing something. Meanwhile, there are plenty of legitimate reasons not to be a materialist.

As a meta-comment, I find the condescension in your comment unnecessary. Why is he "spewing" and why is it "quackery"? Did he not publish a testable, falsifiable theory? Isn't that what science is?


context

I am glad this paper has been written, but already by page 3 there are serious problems with it.

For example: It is an easily observable phenomenon (e.g. by anyone who meditates) that you can be conscious without remembering anything. In other words, consciousness is independent of something like memory.

Yet Max's list on page 3 has stuff like "independence", "utility", "integration" which have nothing to do with observations of what consciousness is actually like. Rather, they are more like high-level ideas of what human beings are like.

But we don't need to explain human beings (complex biological organisms that walk around and do stuff). Science has got that covered already, at least kind of. So if you are going to clearly think about consciousness, you need to factor out what consciousness really is and look at the properties of that.

This is supposed to be a foundational principle of science: that your hypotheses are attempts to explain things that are actually observed. The first step is to observe things carefully! You don't just go making up hypotheses.

So it's a giant red flag any time a scientist writes a paper about consciousness where they conflate it with memory in some way (which is almost every time). It's a red flag because it indicates that the scientist has not actually spent any time observing consciousness, because they aren't noticing things that are obvious to people who have done that.

(You might think that because we are all walking around conscious every day, there would be no need to observe consciousness, but this isn't true. We walk around in a space governed by Newtonian physics, but it took until Newton to figure out this thing called inertia and that a frictional force is required to make things stop, etc, because if you don't look carefully and make careful measurements, most of the everyday world doesn't appear that way at all. Same thing with consciousness.)


context

As a professional game designer of 16 years, I disagree. The designers of these games are not following the traditional idea of game design wherein you are trying to make something fun or interesting. They are deliberately trying to engineer addictive cash syphons.

context

Yeah, as soon as the guy started saying this I discounted the whole article. Yes, maybe Apple will switch to A7 chips, but this author sure doesn't know enough about processors to have any kind of a privileged viewpoint.

If one doesn't even know what uops are, and doesn't have a mental estimate of what percentage of i7 silicon is devoted to the instruction decoder, one doesn't get to write articles comparing Intel to ARM chips.

Edit: Going back to the article I note the author is Jean-Louis Gassee, so this is just bizarre. He is kind of just talking out his ass and I would hope he'd know better than that, because whereas making stuff up is a survival skill in exec-land, being blatantly and demonstrably wrong about said make-ups is not.


context

These guys never heard of shared memory, apparently?

Does Ruby not provide a facility to use shared memory? I guess you don't get it by default in a GC'd language because the GC thinks it owns the world.
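
(For the unfamiliar, a minimal POSIX shared-memory sketch in C++; illustrative only, not tied to the system being discussed:)

    #include <fcntl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main() {
        // Create (or open) a named region; a second process that opens the
        // same name sees the same bytes, with no copying between processes.
        int fd = shm_open("/example_region", O_CREAT | O_RDWR, 0600);
        ftruncate(fd, 4096);  // size the region
        void *p = mmap(nullptr, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        // ... read and write through p ...
        munmap(p, 4096);
        close(fd);
        shm_unlink("/example_region");  // remove the name when done
        return 0;
    }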


context

Anyone who could make Minecraft by himself is at least kind of okay.

I recommend you build a game of similar scope before saying stuff like this.


context

Not true. I am talking specifically about interaction response, not the rate of animation display. It is about your brain learning responses to its actions so that it can better plan its actions. That cannot happen when the responses are semi-randomized.

Seriously, there are tons of studies on this. I am not making it up.


context

I would gladly work on "defense tech" too, as in actually defending people. But what you mean by "defense tech" is actually offense tech. Look at how these things are actually used in the world.

When an entire field needs to stand behind a euphemism in order to prevent seeming ugly, that is certainly some kind of a sign.


context

Middlebrow dismissal.

Believe that all of us working in video games want program execution on mobile devices to be as fast as possible. My brand-new desktop PC is not fast enough for what I want to do, so an Android phone running a bytecode interpreter is that much further behind.

I also don't think you know what "vaporware" means. If it is actually running on physical hardware it's not vaporware; it is just not in consumer hands yet. (Since Engadget has played with it on a physical phone... it is known to be real.)


context

Some time ago, I wrote some articles on image scaling that go into a bit more technical depth (with a specific focus on mipmapping):

http://number-none.com/product/Mipmapping,%20Part%201/index....

http://number-none.com/product/Mipmapping,%20Part%202/index....

A problem I get into at the end of the second article is that gamma-correction is very important for good image scaling results. However, almost nobody gamma corrects during scaling, even today.


context

Have you looked at EA's financials?

They have been losing money for years.


context

I did a lot of consulting / contract work at AAA companies between 2000-2005. "Parachute in and make the E3 demo work / implement some tough feature / etc". Sometimes it was less well-defined than that.

context

Look ... guys ... this is NOT TECH, unless you classify hooking up a sewage pipe at the RV Park as tech too.

This is just another example of fighting through a gross kludgey mess to do something relatively trivial that we already knew how to do, much more robustly and performantly, in other systems.

Any time you see an article titled "X in HTML5" you know you are about to ride the short bus. I recommend learning some computer science.

See also Alan Kay's comments about the Web vs the Internet, recently linked on HN...


context

The irony of your reply is that this is probably what he (or anyone else writing this blog entry) would have said 10 years ago. The point of view he is writing from is 10 years later...

I am not explicitly saying that you would for sure agree with him 10 years from now, but I want to at least suggest it's a possibility.

For what it's worth, I run a software company, in which I sign paychecks, and I think what is said in this posting is pretty smart.


context

Lots of low-end, non-premium-experience games are made with those things. And e.g. games that use Lua usually only use it for high-level gameplay logic, i.e. most of the running code is in C++ or something.

And yes, it's a problem.

See the other comments here about VR. With VR you want to render at 90 frames per second, in other words, you get 11 milliseconds to draw the scene twice (once for each eye). That is 5.5 milliseconds to draw the scene. If you pause and miss the frame deadline, it induces nausea in the user.

But this comment drives me up the wall:

"GC doesn't seem to be a show stopper for them, you just have to be smart about allocations..."

The whole point of GC is to create a situation where you don't have to think about allocations! If you have to think about allocations, GC has failed to do its job. This is obvious, yet there are all these people walking around with some kind of GC Stockholm Syndrome.

So now you are trapped in a situation where not only do you have to think about allocations, and optimize them down, etc, etc, but you have also lost the low-level control you get in a non-GC'd language, and have given up the ability to deliver a solid experience.

Bad trade.
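
(To illustrate the low-level control in question: a minimal sketch of a per-frame linear arena, a common game-engine allocation pattern; the details here are illustrative:)

    #include <cstddef>
    #include <cstdint>

    // Per-frame linear arena: each allocation is a pointer bump, and the
    // whole frame's memory is released at once. Deterministic, no GC pauses.
    struct FrameArena {
        uint8_t *base;
        size_t   size;
        size_t   used = 0;

        FrameArena(uint8_t *memory, size_t bytes) : base(memory), size(bytes) {}

        void *alloc(size_t bytes) {
            size_t aligned = (used + 15) & ~(size_t)15;  // 16-byte-align the offset
            if (aligned + bytes > size) return nullptr;  // out of frame memory
            used = aligned + bytes;
            return base + aligned;
        }

        void reset() { used = 0; }  // called once per frame; costs nothing
    };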


context

No, because the final ticket purchaser would have been even happier with the ticket plus the extra cash he would have had if he had not needed to buy from the scalper, and/or the venue or promoter would have been happier with the extra money that is instead taken by the scalper.

Come on, this is obvious.


context

Ferrari sells under 8000 cars a year.

Lamborghini sells under 2500 a year.

Chalk this article up to careless punditry, i.e. what you get from an ad-driven internet news model.

Edit: just for kicks, I looked up Rolls Royce, which is kind of the canonical "luxury car". Under 4000 cars a year.


context

O() is colloquially used to mean "approximately", among people who do actually know what O() means.

context

It is the other way around. If you are doing effective altruism, you want to make sure you have money coming in that you can be altruistic with.

The bit about Walmart was especially badly done. If the author of this article thinks Walmart employees are poor on the same scale as the poor that the foundation targets, he really has no clue about world poverty. If the minimum standard of living throughout the world were equal to that of a typical Walmart employee, the world would be tremendously better off.


context

Left to its own devices, a baby will stick pretty much everything into its mouth.

If you believe that evolution generates behavior to maximize survival value, this fact about babies seems relevant.


context

They have an extra space at the end of the name?
Paste fail.

context

You are misunderstanding the physics. Suppose A and B are two quantum entities that result from a collision. They are entangled and moving away from each other. The math very clearly says that a modification of A's state will alter B's state after the time of collision. In fact, it must do so, since A and B do not have separate states. Their states are the same one little chunk of math.

This even works if the states are known! This is how quantum cryptography works. You put two particles into a Bell state with each other, give one to someone else, and then decide later what message you want to send.

I recommend learning the math. It is pretty simple if you already have linear algebra.
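
(For reference, the standard definition, added here: a Bell state of two qubits is)

    % One of the four Bell states:
    \[
        |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}} \left( |00\rangle + |11\rangle \right)
    \]
    % This state cannot be factored into (state of A) \otimes (state of B),
    % which is the sense in which the two particles' states are "the same
    % one little chunk of math."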


context

Sometimes rigid code is simpler, sure. But what I am arguing is that it is almost never more debuggable / maintainable.

What I am saying is not specific to test-and-branch, though test-and-branch is great because it gives you these big code blocks into which you can insert more code, and it's clear where that code lives and under what circumstances it runs. That is something you don't get in assembly language, which is part of why the assembly-language reply is a goofy straw-man argument.

Yes, my reply was a bit irritable; I would definitely prefer to have a reasonable discussion, but the assembly-language thing was the first volley in being unreasonable. Putting up a straw man like that is an attempt to win the argument, not an attempt to understand the other person's position. I detected this and decided, well, if that's the position, then it is useless trying to make further / deeper rational arguments, so I am just going to say, this comes from a lot of experience, so take it or leave it.

As fatbird replied, "This is shitty." (I can't reply to his reply yet because of the timed reply thing, so I am including it here.) Maybe it is shitty, I don't know, but it's true and sometimes you just have to say the true thing to be expedient and get on with life.

I don't have time to teach people on the internet how to program. I work my ass off for 4 years at a time to build and ship games that are critically acclaimed and played by millions of people. These are the kinds of things most programmers wish they had the opportunity to work on, and wish that they knew how to build. (Often programmers think they know how to build these things, and then they go try, and they fail. It is a lot harder than one thinks). I am not saying this to brag, because I honestly don't feel braggy about it right now. It's just fact. I am pretty good at programming (probably not as good as Carmack) and I have worked really hard for a long time to be as good as I am. Meanwhile I am also trying to be pretty good at game design, and oh yeah, running a software company.

So when I give advice like this, and someone retorts, and it seems to be coming from a place of lesser experience, it is not really worth my time to get into a serious argument. I am not going to learn anything. I have been in the place where I had that kind of opinion, many years ago, and then I learned more. Fine. I can either be polite and quiet about it, or say something a little bit blunt and rude, in the hope that the other person (and maybe any bystanders to the conversation) will seriously reconsider what was said in light of the new information that it comes from someone who is maybe not a joker. I can't spend a lot more time than that teaching everyone on the internet how to program, because it takes almost all the energy I can muster just to build software. (Though occasionally I do write up stuff about how to program, and give lectures bearing on that subject, like this one: http://the-witness.net/news/2011/06/how-to-program-independe...).

Of course this don't-get-into-the-argument strategy of mine has at least partially failed, since here I am typing out this really long reply. I don't know.


context

It is definitely less convenient if you don't have a garage. But as mentioned, this is starting to be addressed. Also, the Supercharger-style charging station becomes a lot like a gas station would be for a normal car.

I do think that "I don't have a garage / dedicated place to park my car" is in fact the only current anti-EV argument that has any basis in reality. But it's addressable.


context

So pick something that requires very little capital, like saving the 16000 children who die every day from starvation-related problems in developing countries.

"I lack capital" is a very silly excuse given that the HN community are among the richest people in the world... by definition, if you have the luxury of even thinking about moving to Mountain View and starting some web site, you are among the richest people in the world. A lot of people wake up in the morning and their first task is to figure out how they are going to eat today.


context

They are announcing that you can watch shows on your television, while hooked into your mobile device, which is being controlled by your tablet device, which is hooked into your oven all while sitting in the refrigerator.

context

I think you didn't understand that paragraph. The implication is that Goldman couldn't get the stock, and in fact did not bother, and just lied and said they did.

context

Why is this on HN? The author of this article does not know anything about the game industry.

The reason most developers of $60 games sign with publishers is because they do not have the money to develop games of that size themselves.

Whether it is a good idea for them to be doing this is really a different question (and it is complex to answer).


context

I am very happy to see this list.

I came to Demo Day in 2010 (as an investor) but left without investing in anything, because I was so demoralized by the way it seemed everyone was trying to start lame web sites doing relatively trivial things.

If Demo Day looked like the stuff on this list, I'd be banging down the door to get in again.


context

I have an iPad Air, and there are definitely basic iOS7 operations that lag and stutter on it. (For example, pulling down the search box).

context

No surprise. Most "scientific" studies in the realm of medicine are bullshit. Even many of the ones trying hard not to be bullshit still end up in that bin. For a clear understanding of why, read this:

http://slatestarcodex.com/2014/04/28/the-control-group-is-ou...


context

Computational approximation has been shipped en masse very successfully.

You mention GPUs but did you know that GPUs already do a lot of approximate math, for example, fast reciprocal and fast reciprocal square root?
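
(The classic software analogue of that kind of hardware approximation is the fast inverse square root bit trick; a sketch:)

    #include <cstdint>
    #include <cstring>

    // Approximate 1/sqrt(x) via the well-known bit-level trick, refined by
    // one Newton-Raphson step; good to roughly 0.2% relative error.
    float fast_rsqrt(float x) {
        float half = 0.5f * x;
        uint32_t i;
        std::memcpy(&i, &x, sizeof i);     // reinterpret bits without UB
        i = 0x5f3759df - (i >> 1);         // initial magic-constant guess
        float y;
        std::memcpy(&y, &i, sizeof y);
        y = y * (1.5f - half * y * y);     // one refinement step
        return y;
    }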

You mention how approximation must be impossible in all these applications (because REASONS) but all methods that numerically integrate some desired function are doing refined approximation anyway. If you have another source of error, that lives inside the integration step, it may be fine so long as your refinement is still able to bring the error to zero as the number of steps increases.

Your diagnosis of "utter computational tripe" and the accompanying vitriol seem completely inappropriate.


context

Well wait a minute. They predicted "there is quantum activity in the microtubules". Almost everyone said "you are crazy, that's impossible". If it turns out indeed that there is, then this was a successful scientific prediction, in the style of predicting the structure of DNA or the existence of the neutrino.

You are right, it doesn't mean that this is the seat of consciousness, but it doesn't mean it isn't, either. Further investigation would be required. But saying it's "off base" just because you don't like the thesis is exactly the opposite of what science is supposed to be. It is scientism, not science.


context

I feel like a reasonably successful person at this point, and I don't do "networking". I am not uncommon in this regard; most of the most successful people in my industry do not do "networking". That's why I posted my original comment; I feel it is a reality, among at least a very strong subculture of talented people, that is underrepresented in rhetoric.

Addressing some of the other replies: I have done certainly a lot of speaking engagements, and yes, these have been very helpful for becoming more known and whatever, but I never do them for that reason; I always do a speech because I have something specific that I really want to say. Any publicity is a by-product (and sometimes publicity is highly aggravating and undesired). I certainly don't try to meet people via speaking events, parties, dinners, whatever. Sometimes I do end up meeting people, but not that often really, and again, it is a by-product.

In my experience, successful people almost always go to a party just to go to a party and relax or see what's up. They aren't going to a party for ulterior motives like maybe meeting someone who they might be able to get something out of and blah blah blah. Actually, successful people often just don't go to parties because they have other things to do and parties where you don't have a strong peer group are not going to be very interesting.

If you have a specific business objective, you are not going to solve that by randomly going to an event and having random conversations. You are going to solve it by calling someone on the phone or emailing them. If you don't have a specific business objective, you probably won't find much traction with whatever you are doing unless you get a specific business objective.


context

I don't think "we haven't seen confusion around this" is a good answer. Most users will not even understand what is happening, they will just have an ambient level of uncomfortability with the UI that they don't know how to explain (not only due to this, but to many other factors as well, all of which contribute).

Don't tell me that your users don't have an ambient level of uncomfortability with the UI, because we all know that is the case (it is even the case on iOS, now much more than it was with iOS 6!)

If I were the UI lead for Android, I would be scouring the system from top to bottom, looking for ways to simplify and streamline, to increase predictability. All kinds of current UI actions would get thrown away (hint to Google Maps people: Shake is not an intentional UI action, it is what happens accidentally 5 times per minute when I am using your software. It should not be bound to anything, ever. [Apple is just as bad for binding shake to Undo, but at least theirs has a higher threshold now]. It is nice that at least now you can finally turn it off after it activates a few times, but haven't you noticed how everyone turns it off all the time, and nobody ever uses it for the intended action ["leave feedback"]? Get rid of it.)

Anyway, all I can say to the response "This is why we have the beta, to test this stuff" is: if this attitude were actually solving all the problems that need to be solved, Android would have an amazing UI that everybody raves about. That is not the case! Why is it not the case?


context

You are responding to something totally different than what I was saying. Yes, we all know that there is a one-way exit from security zones in airports.

When you are passing through such an area, you are free to move unless you are detained, which in theory would not happen without good reason. The idea is that you are free to exit and just cannot go back in.

When you enter one of these chambers, you are detained by default until you are released. You are not free to go in any direction. It is very different.


context

I've flown into Copenhagen, Amsterdam, Cologne, Hanover, Paris, and London, and I do not recall seeing anything like this. I'd think I would remember as it is very alarming-looking.

context

I am, and the answer is: not very much.

I own a Roadster, and after 2.5 years of daily driving, my battery capacity is down by 6%. That's not 0%, but it's also not a big deal.


context

Also, re this: "but being a single platform developer kind of sucks. Become the kind of developer who doesn't care what platform they get to develop on".

I agree with this in principle. But in practice... I invite you to develop large and complicated projects (like non-small games) and see if you retain this opinion. I find that work environment matters, a lot.

The thing that's a little sad is that developing on Linux could be great, if only open source developers had a culture of working a little harder and taking their projects past the 'basically working' stage and toward hardened completion. When things are solid and just magically work without you having to figure out how to install them or what config file to tweak, it drastically reduces friction on your productivity. So there's a productivity multiplier that open source developers are not getting, thus making all their work harder; because hardly anyone works on the basic tools, everyone else's development is more painful and takes longer, but nobody realizes this because everyone is used to it.

If someone made a debugger that let you inspect/understand programs 2x as efficiently as gdb does (I think this is not even an ambitious goal), how much would that help every open source developer? How much more powerful and robust and effective would all open source software become? (At least all software written in the languages addressed by said debugger...)


context

No, you clearly didn't understand my point. I am talking about maximizing the rate of successful features implemented. Programming things in assembly language is obviously not going to do that.

I don't know, man. I have 31 years of programming experience. I am not detecting from your argument that you have anywhere near this level of experience, so I am inclined not to get into this discussion. But I will say that your code example at the end of your comment is exactly what I am talking about. It happens all the time that I want to put something in front of 'return b' (or, in fact, I just want to put a breakpoint on that line in the debugger! Not going to happen in your second example...)


context

Burning coal / electricity / oil to explore, drill, transport, refine, and re-transport oil ... Then using that to fill your car, so you can blow it up inside an engine.

Sounds a little less efficient than the electricity thing when you look at the full sequence. And this is even in the case where electricity comes from dirty sources; some states, like California, have an energy mix where electricity comes mostly from renewables. (And this is something that can be improved as time goes on).


context

As a longtime professional game developer, I kind of don't get it. Is this company out of business as soon as Oracle implements a loose octree that is ACID? Is there something wrong with spatial hashing?

etc, etc.
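
(For reference, spatial hashing fits in a page; a minimal 2D sketch, not production code:)

    #include <cmath>
    #include <cstdint>
    #include <unordered_map>
    #include <vector>

    // Bucket entity ids by integer grid cell; neighborhood queries touch
    // only the handful of cells around the query point.
    struct SpatialHash {
        float cell_size;
        std::unordered_map<uint64_t, std::vector<int>> buckets;

        uint64_t key(float x, float y) const {
            int32_t cx = (int32_t)std::floor(x / cell_size);
            int32_t cy = (int32_t)std::floor(y / cell_size);
            return ((uint64_t)(uint32_t)cx << 32) | (uint32_t)cy;
        }

        void insert(int id, float x, float y) {
            buckets[key(x, y)].push_back(id);
        }

        // Gather everything in the 3x3 block of cells around (x, y).
        std::vector<int> query_neighborhood(float x, float y) const {
            std::vector<int> out;
            for (int dx = -1; dx <= 1; dx++) {
                for (int dy = -1; dy <= 1; dy++) {
                    auto it = buckets.find(key(x + dx * cell_size,
                                               y + dy * cell_size));
                    if (it != buckets.end())
                        out.insert(out.end(), it->second.begin(), it->second.end());
                }
            }
            return out;
        }
    };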


context

I have been having similar lamentations recently.

One obvious example: the San Francisco Bay Bridge. The original bridge was built in 1933; it took them less than 3.5 years to build it.

Now, they are just trying to replace the eastern span (the bigger one, but hey, less than the full bridge), with technology from nearly a century later. They have been "working on it" for 9 years. It is currently scheduled to run another 2 years from now (if it somehow gets done on time) and is 6 times more expensive than originally projected.

For a bridge. That doesn't have any more traffic capacity than the bridge it is replacing.


context

Hey Steve,

Who do we have to harangue to get Rust to rename "Vector"? It is kind of embarrassing and confusing terminology, and there is no reason to propagate it. Just call an array an array and an immutable array an immutable array.

There's no good reason to perpetuate the mistakes of the C++ people.

(Note to those not understanding the objection: A vector is defined as an element of a vector space, a set closed under addition and scalar multiplication. This is a very specific and very widely-useful meaning, and any program that does stuff with math is going to use vectors. So when you come along and put into your standard the idea that 'vector' means an arbitrary collection of elements that probably are not even scalars, you not only show that you don't know what vector means, but you confuse the programs of many, many of your users, because now they have two totally different things both of which are called Vector and that are both used very heavily. [There is no way in hell anyone is going to call a math vector anything but vector, since that would be insanely confusing.])


context

The article is good, but I agree that the headline taken out of context is highly misleading.

In the context of the article it makes sense. But out of context it appears to be saying that all games should cost a lot less than $60, which really isn't the point.

In fact to justify a luxury price point you just have to give people something they really want. For example, right now lots of people are paying $150 to get into the beta for Elite: Dangerous, a game that I presume will be substantially cheaper in full release:

http://elite.frontier.co.uk/

If $60 is insane then $150 is totally nutballs, yet that is what a nontrivial slice of people are paying, because this beta is giving them something they want that they can't get any other way.

(Note: In the alpha, people were paying $295!)


context

If you read the series, you'll see that what you call "rejection sampling" is in fact the first thing that was tried, and it was set aside for performance reasons. Yes you can try to improve the perf of this kind of system, but the goal was to build something that would run with much less overhead than that. And this was accomplished, and it seems to me that the end result runs faster than any optimized version of the system you are proposing. So I'd be careful with dishing out what reads to me here like middlebrow dismissal.
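
(For bystanders: plain rejection sampling, the approach in question, looks like the sketch below; the wasted rejected draws are exactly the performance concern mentioned:)

    #include <random>

    // Draw from a bounded density on [lo, hi] by uniform proposal + accept/reject.
    double sample_rejection(std::mt19937 &rng, double (*density)(double),
                            double lo, double hi, double density_max) {
        std::uniform_real_distribution<double> ux(lo, hi);
        std::uniform_real_distribution<double> uy(0.0, density_max);
        for (;;) {
            double x = ux(rng);
            if (uy(rng) <= density(x)) return x;  // accept; otherwise the draw is wasted
        }
    }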

context

What is described is something toward the top end of basic competence. The only reason someone would think this is crazy is if they have been hanging around bad engineers and / or administrators. That said, most engineers / administrators are pretty mediocre, especially these days now that so many more people are doing it. So.

I mean, if "managing office firewalls" is on someone's list of things that are impressive, maybe that person does not have a clear view of the problem space.

But I should not even be giving the article so much attention as this. It's clear the author is confused. At first he talks about how rare and crazy it is to find someone who can do this, but then, contradictorily, he laments that PA will get tons of applicants who can do the job. Well, guy, which is it? They can't both be true.

All that is going on here is that someone had a negative reaction to the job posting and is trying to express and rationalize their reaction, regardless of how that rationalization really matches up to reality. Happens all the time, why am I even replying?


context

You are thinking of a different gamma-correction step. The reason you gamma-correct during rendering is because monitors expect data to already be gamma-corrected, and it is this expectation that causes the problem I am talking about. Data in a bitmap is not linear with the light-intensity of the colors; it is gamma'd!

The simplest kind of mipmapping is a box filter, where you are just averaging 4 pixel values at once into a new pixel value. Thinking just of grayscale pixels, if you add 4 pixels that are each 1.0 (if you are thinking in 8 bits, 1.0 == 255), and divide by 4, you get 1.0 again. If you add two pixels that are 1.0, and two that are 0, you get a value of 0.5. Which would be fine if your bitmap were stored in units that are linear with the actual intensity of light; but they are not, because they are in a weird gamma! What you are going to get is something like pow(0.5, 2.2) which is way too dark.

Thus when you don't gamma-correct during mipmapping, bright things lose definition and smudge into dim things way more than actually happens in real-life when you see objects from far away.
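
(In code, the difference for a 2x2 grayscale box filter, using the 2.2 power approximation described above; real sRGB uses a piecewise curve:)

    #include <cmath>

    // Naive: averages gamma-encoded values directly; bright/dark edges smudge dark.
    float box_naive(float p0, float p1, float p2, float p3) {
        return (p0 + p1 + p2 + p3) * 0.25f;
    }

    // Gamma-correct: decode to linear light, average, re-encode.
    float box_gamma_correct(float p0, float p1, float p2, float p3) {
        const float g = 2.2f;
        float linear = (std::pow(p0, g) + std::pow(p1, g) +
                        std::pow(p2, g) + std::pow(p3, g)) * 0.25f;
        return std::pow(linear, 1.0f / g);
    }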


context

Whatever. It is easy to make a single decision that is right or wrong. Or, a decision can be right given limited information ("this thing has low odds of paying off"), but "wrong" given what actually happens ("what do you know, against odds, it paid off!") Anyone who plays Poker knows this.

You can't look at one decision and use it as a barometer to know whether a CEO is good or bad. That is just silly. It makes more sense to look at the body of decisions.


context

Destructoid is much more ad-heavy than other similar gaming news sites, and they tend to implement the ads in a way that hits browsers hard performance-wise, so that the site is just painful to navigate. I stopped going to Destructoid a while back for exactly this reason.

It seems that such heavy ad bombardment has been driving their regular users to block ads. It only makes sense.


context

This is a "parody site". It is not a real article.

context

My point is that when you are writing complicated production code, and you are a good programmer such that your rate of features successfully implemented is high, then you will often be going to old code and changing that code to behave somewhat differently than it was before.

When you do this, you want that old code to be like putty. You want to bend it into a new shape without having to break it and start over. Sometimes it really is better to break it, if bending would be too messy or cause problems later or otherwise sets off a red flag in your head. But if you have to break and re-make everything all the time, you won't be a very fast programmer. So you learn how to bend things, elegantly.

And after a while of this, you learn how to write code that is more amenable to elegant bending in the first place. When you type code, you are not just implementing a specific piece of functionality; you are implementing that functionality plus provision for unknown future times when you will need to come back and make the code different.

(To link this more thoroughly to the previous comment: it happens all the time that you write code that is not really about doing stuff, but then you later need to make that code be about doing stuff. Sometimes this is for shipping functionality reasons, sometimes it is just to temporarily insert hooks for debugging. Declaring in advance that this code shall never be about doing stuff is usually a mistake.)


context

Wait, now you are comparing a luxury car to a Toyota truck. That doesn't make any sense at all, so your question "Which one makes more economic sense?" doesn't make sense.

By your logic, everyone who buys a BMW M3 or a Porsche Boxster ... or whatever ... is dumb because they should have bought a Toyota truck because it is only $19,000 over 5 years.


context

The Roadster is definitely expensive, but now the Model S, just a few years later, is 1/2 to 2/3 the price, and is a superior car. Thus, the price is declining rapidly, and this is obvious to anyone who cares to look at actual facts.

context

Forgot to mention that I am an electric car owner as well (Roadster).

I don't really like the idea of battery swapping. It seems like a big hassle. The current Supercharger seems totally fine. I guess it remains to be seen how well it works in practice, once a lot of people have the cars, but right now things look really good.

One of the most common pieces of anti-EV rhetoric is "the infrastructure doesn't exist". This is actually FUD. The infrastructure is everywhere: we have electricity pretty much everywhere. (Consider: all these gas stations have electricity, so the penetration of electricity is a superset of the penetration of gasoline). What we don't often have is the right plug, but that is a relatively small problem.

But if you start talking about installing a widespread battery swapping paradigm, then that is a huge infrastructure problem, because you need to have stocks of all these physical things all over the place.

On the other hand, with Supercharger, you don't need that. You just need some electricity. It is much simpler.


context

It is not exactly that developers are "paying for exclusivity", it's that if they want to be on the platform, exclusivity is required, and fee payments are required. Basically, you have to pay to play. The reason a developer does it is because he hopes to sell enough copies on that platform to make up for the fees.

context

And it's worse, because... "only 60fps?"

60fps was fine pre-VR, but now you want to do 90fps times 2 eyes.

That's a lot of rendering.


context

It is not that meditating disables anything; rather, meditation consists in observing consciousness in a way that is orthogonal to these things. When meditating, sometimes (often!) you are remembering things, but sometimes you aren't. And in fact what you come to realize is that this is also true in daily life: there are many, many gaps during the day when you are not remembering or thinking about anything. By the logic of your question, wouldn't you just immediately forget what you are doing and stop in the middle of your day? Well, no, you don't, and you can hypothesize as to why (unconscious processes or whatever), but the point is that those unconscious processes can be explained without hypothesizing consciousness (isn't that what a robot is supposed to have?)

All I can say is, try meditating sometime and you will see. (I don't recommend mantra meditation, but rather something more like Vipassana or "mindfulness" or any meditation that is not about distracting your mind by keeping it busy).

When you become comfortable with meditation, you become very aware of what your consciousness is doing. You gain a palpable sense for the present moment. Once you have that, it makes a lot of these kinds of questions unnecessary (or at least the questions become very different in nature). If you don't have this taste for the present, then asking/answering questions like this is like trying to explain colors to a blind person. It just doesn't work because most of the questions are about things that don't really have anything to do with consciousness.


context

The scary thing is how proud they are of their system; it has a cute name and everything.

If the person who CTO'd that worked for me, they would not be CTO any more.


context

I disagree. The fact that it happens inconsistently creates a more-complicated situation for me to internalize; and even if I know the rules (which most users don't), it may not always be clear which case is happening unless I engage in experimentation -- can I tell just by looking which pages are going to have the delay and which aren't? Sometimes it will be obvious, but will it always? Of course not. So sometimes I have to tap before I know. And that is if I am an extremely experienced and knowledgeable user, which most people are not. For most people this will be just one more small piece of inconsistency in the pile of confusion that is Android.

Removing the delay 100% of the time, always, would be a big win.


context

I am 42 years old and I have never had to change a tire, not once.

context

Okay, but who is Dr. Drang? As near as I can tell from a web search, he is someone who puts some code on GitHub and writes a blog. Has he sent rockets into space? Why does he feel like he gets to talk down to someone who has sent rockets into space, on the subject of engineering?

context

Yes. I was interested in the site, went to the page, saw the signup boxes, and left.

context

When I was in college, and shortly afterward, I was very much into "new programming paradigms" and would get excited about lazy evaluation or continuations or whatever was the new cool idea going around. I have designed and implemented several programming languages built around new / wacky features; the most recent of these was ten years ago.

What you are hearing now is not ignorance, it is experience. I am a tremendously better programmer than I was in those days, and the way I got better was not by getting excited about wacky ideas; it was by noticing what really works and what doesn't; by noticing what the real problems are that I encounter in complicated programming projects, rather than what inexperienced / pundit / academic programmers tell me the problems are.

Clearly you didn't really read my comment, though, since you are saying "If callbacks work for you in your job..." and my entire point is that callbacks are terrible.


context

What does "the batteries are too heavy" even mean? Too heavy for what?

Tesla just released a really big car with a heavy battery that goes 265 miles on a charge (under the very strict new EPA rating; at older methods of measuring range this number would be much higher). So I ask the batteries are too heavy for what exactly?

Also, if you are paying attention, you know that Tesla has just opened to the public a number of Supercharger stations that will charge the Model S battery 50% of the way in about 20 minutes. So your complaint of "wait hours for a charge" is already solved, today, in 2012, at least if you live in California (and Tesla plans to expand the Supercharger network rapidly).

There have been anti-EV arguments for years, but the arguments keep changing, which is how you know that EVs have won. The main argument used to be that EVs would never go far enough, that people would have too much range anxiety. That has been solved. Then the argument was that the cars are too expensive. That is in the process of being solved right now, as you see from Roadster->Model S. There are other problems but they are much smaller. "I park on the street so I don't have a place to charge my car" is not very hard to solve: it is obviously just a matter of will.


context

Tesla's EV batteries already in common use last a lot longer than that (7-10 years), so I am not sure what your point is.

context

Article looks completely fake? Picture looks Photoshopped, no actual non-generic humans referenced in the text? Link bait? Why is this here?

context

Which findings of Kurt Gödel? Care to explain? Because, for example, this seems like it has nothing to do with the incompleteness theorems, unless one doesn't understand the incompleteness theorems at all and is just hand-waving.

context

I don't know a lot about Square, but this article reads like a hit piece, wherein the author has an agenda to write a negative article about the company and to cherry-pick facts and bend descriptions to build the narrative.

All magazine-style writing is like this to some degree, but this one is just too extreme. It is basically junk writing.

Look, for example, at what Vinod Khosla says about the Starbucks deal, all of which is totally reasonable, but he is turned into some kind of guilty-person-in-detective-fiction with phrases like "he says defensively", which strengthen the desired narrative but have no relation to verifiable facts.


context

Dude you have NO idea what you are talking about. An occasional 10ms pause absolutely will "prevent that". Those of us who work on software that does a lot of rendering using these fabled GPUs you mention know that feeding the GPU properly is a big problem and requires a lot of code to be running on the, err, CPU.

I don't even know what you are talking about wrt network congestion. What are you talking about??


context

Sorry, but you are drastically underestimating the knowledge and experience of the author of the article, for no other reason than to try to make yourself look smart, and it is not working.

context

I want a language I would enjoy programming in. Rust seems to have a lot of what I want (GC not mandatory, perf, etc, etc) BUT private-by-default is a terrible idea. I started writing a Rust program and I was just typing pub pub pub all over the place. Ugly. That plus the excessive markup the language requires for safety are enough to put me off it.

Video game programmers desperately need a new language to replace C++, but I think this is not it, because the amount of friction added for safety reasons feels very high.

I'll keep an eye on the language as it evolves, though.


context

I don't think this is necessarily true. Yes, people leaving for keyboard-feature reasons are power users. But there are other keyboard reasons.

For example, iOS7 totally broke the keyboard so that fast touch-typists just can't type on it any more. This doesn't matter on something small like a phone, but on an iPad it is super-frustrating. Apple fixed it somewhat in recent updates but it's still broken in a few basic ways, all having to do with the fact that the person who programmed the new keyboard doesn't know how typing works.

It is confusing enough that a non-power-user probably doesn't have a clear idea why typing sucks now, they just know that letters don't come out with the right capitals any more, sometimes extra spaces show up, and damn that shift key is confusing. etc. It is not so much "Android is better" as it is "iOS is not a nice experience any more".

There are similar things to do with web browsing. Web browsing is supposed to be one of the few things these devices specialize in, but on my iPad Air it is terrible. If I go to a web site with images on it and scroll down, most of the images don't load for a LONG time, leaving me with mainly a blank page. When iOS7 came out the browser crashed all the damn time. Now that is mostly fixed but it still crashes sometimes.

Meanwhile there are all these timing-based gestures that are thrown off by browser performance problems, leading to bad user experiences. For example, I want to scroll down a page, so I press on the screen, drag, and release. Well oops, some piece of JavaScript ran right then or something, causing the browser's timing to get confused (maybe it counts frames and only a couple of frames went by), so instead of a drag it interprets this as a tap, which for some reason causes me to select some garbage on the page that I don't care about. Well now I am in select mode and things get only more confusing from there (especially if more performance hitches are happening). This happens all the time.

When you can't even scroll down a web page reliably, and yet that is one of the main use cases you are selling your product for, you can't claim it is a luxury product. You aren't delivering a luxury experience so you can't charge a premium.

Apple has gotten away with this in the past, in similar situations, though, because of newness and shininess. As Marc Andreessen pointed out recently, for the first few years you could barely even make a call on an iPhone, and when you did it was super-frustrating. But still it caught on. I think this is just because it was so new and exciting and there weren't real competitors yet. Once the bloom is off that rose, in order to be perceived as a premium brand, you have to actually deliver quality. But Apple is not delivering quality with the OS; they are just delivering some kind of skin-deep attempt at an appearance of quality.

I consider iOS7 to be a huge misstep and a giant missed opportunity, much bigger even than Siri or Maps. I am not sure if fixing iOS7 would solve all Apple's problems, but it is where I would start.

However, addressing the thesis of this article -- lack of UI features -- would not be my first step, because I don't agree with the article that iOS 7 is simple. In reality it's a mess; it just tries to appear simple. So the first step is making it really, actually simple, and making it deliver a solid, quality experience. Then you can think about adding UI features, which I would claim people don't care about as much.


context

It's a neat idea, but by way of honest feedback my eyes started glazing over at "create this agent and that agent", and it became clear I don't want to add this complication to my life.

I think this would be vastly more interesting to many more people if the interface were just a line that says: "Tell me when [text field]" and I type something into the text field like "Tesla announces anything."

That is all I want to know, so why do you make me do a ton of stuff to implement a query? (Or more basically, why do I need to set this up on my own server? Isn't it way better if it is a service that you just run?)


context

I like the broad idea (applications should be very responsive). Yes, this is very important, but I can't shake the feeling that the author of the manifesto has very little experience with responsive software (or even with software generally).

For example, the manifesto confuses ends with means. It states a desired end, but then claims that certain means are required to get there (for example, "event-driven"). Maybe event-drivenness can come into play in a given system, maybe it shouldn't; across a broad set of domains this is orthogonal to the concept of responsiveness.

In video games, for example, we do things that are extremely responsive compared to web stuff (last week I worked on something that had to run at 200 frames per second in order to meet requirements). Interactive 3D rendering systems are most certainly not event-driven; they derive their responsiveness from cranking through everything as quickly as possible all the time.

There are lots of different domains of software out there and they all have found different local attractors with regard to what techniques work and produce the best result. Web software is just one of these domains, and frankly, it isn't doing so well in terms of quality compared to some of the other ones. So I think if one wants to write a manifesto like this, step one should be to get out of the Web bubble for a while and work hard in some other domains in order to get some breadth and find some real solutions to return with.


context

I do run a software company, and I would much rather hire someone with Penny Arcade on their resume than, say, Oracle or something.

context

Just pause for a moment on "everything that exists" because that is not necessarily a useful distinction.

If you think of a universe as "everything I could see/touch/etc given infinite time", someone outside the universe who was in control of it might record it with no problem because they are not subject to the constraints of the universe. e.g. imagine the universe as being like a computer simulation that you programmed. You can probably pause it in a debugger any time you want, look at whatever state variables you want, etc.

So in this case the definition cannot be "everything that exists" because you have to define "exists" in the case where different sets of things may have different levels of "reality".


context

Triple-buffering doesn't help that much, and it adds an extra frame of latency (16.7 ms at 60 Hz), which is really bad.

context

I have driven a Tesla Roadster for 3 years. The heater does not impact range much.

context

... though I guess if you look at the rationale behind all these features as "get people who buy into these features so tangled up in them that they can't use anything else, or reasonably port their applications to another OS", then it all makes sense.

Of course, that is the opposite of what I as a software developer want.


context

Here is my startup idea: an ad company that also allows people to register and pay a few pennies per article in order to turn off all ads for anyone participating in the network, and for a little bit of premium / exclusive content that nobody else gets access to.

You set the price high enough that sites will want to actively switch to your network, since the premium users are worth a little more than the free users. And you require as a condition of membership that 1 article in 50 (or whatever) is premium-only content.


context

My electric bill has increased by about $15 a month, depending on how much I drive. It is way cheaper than gas. (But the car is so expensive that you won't break even via cheaper fuel!)

context

I think this "electric grid can't handle it" is another piece of anti-EV FUD that is not representative of the situation in reality.

Right now, peak electricity usage times are during the day. This is why in most urban areas electricity is cheaper at night: they want to encourage you to distribute your usage more evenly throughout the day.

The NEMA 14-50 (a.k.a. standard appliance outlet that most people plug their dryers into) is a totally fine plug for an EV. It will charge the car up fully overnight. This plug is going to deliver you, at maximum, 40 amps at 220 volts.

Electric dryers often use something like 25 amps at 220 volts (of course it varies by machine). This is not far from the 40 amps we are talking about. So this whole "grid can't handle it" pseudo-panic is sort of like worrying that everyone is going to run their dryer at the same time, times 1.5, at off-peak hours. It is just not a big deal. FUD.
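A quick sanity check on those numbers (my arithmetic, assuming an 85 kWh pack, the size of the big Model S battery):

    40 A * 220 V = 8.8 kW
    8.8 kW * 10 h = 88 kWh

So a night of off-peak charging covers even the biggest pack, at a load on the order of what the dryer circuit already carries.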


context

It is mainly an Xbox Live Arcade thing. They know they can apply exclusivity pressure to independent developers simply because enough of those developers will cave, since indies aren't generally willing to walk away from the deal.

context

... and this attitude is exactly the problem with "kids these days".

Society doesn't hold together by magic. It doesn't evolve in a mutually beneficial direction by magic. People have to make it happen.

Is it okay for some people to make little games, or for some people to want to selfishly make a little money? Sure; a robust society can tolerate that. It's when these attitudes become prevalent that the problems start to happen.


context

It seems likely from your reply that you have no education in matters related to vision. The human visual system uses other things too, yes, but silhouettes are one of the most dominant features because they are most invariant to lighting conditions. (Think about what would happen if you mostly depended on color, but now you are walking around at night when humans have poor color discrimination).

context

Just because Apple did it doesn't mean it's good. Whoever made that UI design decision did not understand the way human silhouette recognition works.

iOS icons are terrible. All the time on my iPhone I just find myself staring at the screen not knowing where anything is. This never happens to me on the PC.

It is just a bad style decision and I wish they would change it.


context

And then I guess the question is, if I avoid stuff like this, why am I posting on Hacker News? I guess the answer is that Hacker News used to be more in the vein of things I liked, but recently is too full of "be a man and launch your me-too web service" kinds of things. To which you might say I should stop reading this site, and to that I would likely agree.

context

I certainly welcome the attempts to reduce annotations and will look again at the language in a while.

It is interesting that the design priorities are so library-centric. I agree that library quality is important, but think about this: For every library that someone writes, N people are going to use that library in leaf programs, where if the library is useful N is much much greater than 1. So by definition the vast majority of code is not library code.

In video games we write programs that are between 200k and 2M lines, say. That is big enough that you do want to think about what API you are presenting from part of that program to another, but stability of that (internal) API is almost never a concern, and in fact this is one of the big boons of statically-typed languages: you can change a function and you instantly get errors at all the calling sites, allowing you to be sure you updated them (except in some bad ambiguous situations).

This fluid ability to change how parts of your program talk to each other is very important in large programs. It is one of the major ways you avoid code rot. The more friction is introduced into this process, the lower the ceiling on how large and complex a program you can build.

The other thing about games is that the problems are inherently very interconnected. Yes, we kind of partition things into different modules, but these modules all want to talk to each other very heavily (partly for inherent problem space reasons, partly for performance reasons). So again, friction introduced here hurts us more than it hurts most people.

I understand that private-by-default seems like the right idea because it supports a certain paradigm of programming that is generally considered the right thing (encapsulate your implementation details, modularize as much as possible, etc). But what I have found, in my field at least, is that this is less true than many people think. Yes, it's important to keep things clean, but fluid ability to change code is very important, and overmodularization hurts this. I think that some day this will become more widely understood, the same way that today most good programmers understand that OOP is not some kind of One True Panacea the way folks in the 90s seemed to think.

Good ideas can be carried too far and I think for my field private-by-default is way too far. Having to put 'pub' on every member of every struct and on well more than half my functions is kind of bananas.

(As someone who does not buy into the paradigm of OOP, I want to write functions that operate on data. In C++ sometimes I encapsulate these functions as struct members but this is just an aid to understanding when it is convenient; most functions are declared at file scope. In order to have a functions-operating-on-data mentality in Rust, I guess I need to put a pub in front of every member of every struct, which feels very distasteful; it feels like the language is pushing me toward paradigms that many in my field have tried and subsequently rejected after decades of pain.)
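To make that concrete, here is a hypothetical sketch (names invented) of what the plain functions-operating-on-data style looks like under private-by-default:

    // Every field the rest of the program touches needs its own 'pub',
    // and so does every file-scope function that other modules call.
    pub struct Entity {
        pub position: [f32; 3],
        pub velocity: [f32; 3],
        pub health: i32,
    }

    pub fn integrate(e: &mut Entity, dt: f32) {
        for i in 0..3 {
            e.position[i] += e.velocity[i] * dt;
        }
    }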

Well, this certainly went a few places, but that is where I am on these issues.


context

I don't think this is quite the right attitude but I still agree with the basic concept.

If you are building a high-end video game (my field), garbage collection simply makes the user experience poorer than it would be otherwise. It is not some invisible implementation factor, but in fact affects the output in the form of stuttering frame times all over the place.

Good luck trying to do a VR game with a reasonable amount of stuff on screen with a GC running (you need to hold 75-90fps, times 2 eyes. And every time you miss the frame deadline in VR, it is not just that the game feels worse; it disorients people and gives them headaches and nausea.)
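For concreteness, the frame-budget arithmetic (my numbers, not tied to any particular headset):

    1000 ms / 90 frames = ~11.1 ms per frame

so an occasional 10 ms collection pause consumes essentially the entire frame budget by itself, before any game code has run.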

Every time I pick up an Android phone and try to use it, I want to throw it out a window, because it is all stuttery and janky. I presume at least part of this is because of GC.

I think this is another occurrence of people improperly weighing obvious benefits versus hidden costs. The obvious benefit of GC is you don't have to manage your memory so much, maybe. (I say maybe because when you start saying that GC performance is not good, the answer is always thinking about memory management in some way in order to reduce load on the GC. "It's a really great car as long as you don't drive it very much.") The obvious benefit is there, but there is also a hidden cost in terms of performance and solid-feeling-ness, and that cost is really pretty large actually. The theory has always been "pretty soon now GCs will get better and this will go away", but this has never happened. I went to college in 1989-94 and used to design GC'd languages as a hobby, so I have witnessed a couple of decades of this.

As a productive working programmer who writes a lot of code that does complicated things, I do not find the memory management to be a large part of what takes my time. If I were to pull a not-scientifically-derived number out of a hat, I would say it takes less than 2% of my time. To get a 2% improvement in productivity, but to pay for it so heavily ("well, there's a class of results that are simply impossible for me to achieve now because the GC might go off at any time"), is just a really bad tradeoff.

I am sympathetic to the idea that some paradigms of programming (functional, etc) are harder to do without GC. Exploring those ways of programming is a good reason to like GC, but given that functional programming is not really quite here yet for most classes of large and demanding problems, well, it's just a very different world from the one I need to build software in today.


context

I just did it. It's not magic. I did have about a year of money saved up from previous jobs (relatively highly-paying contract work).

context

Yes, I mean I have to be doing my own projects 'professionally' wherein I decide what is being made and how it is being made. Not working for someone else, because that someone else is not going to have the same kinds of goals that I would have.

There may have been some jobs out there where I could happily and productively work, but I don't know, because I never encountered one.


context

This headset sounds terrible. Resolution is the least important factor in presence (of resolution, persistence, latency, frame rate, and tracking). It sounds like they have bupkis for the other areas, and low standards besides.

Oh and what is that higher resolution going to do for latency and frame rate? Hmmmmmm.

It kind of shocks me how uncritically positive this article is. The situation to me reads differently, more like "any joker can plug a higher-res LCD into an Oculus DK1 spray painted white."


context

This article seems naive. Probably people are being paid to rate apps down.

context

... And what about the actual action of double-tapping? On some pages it zooms, and on some pages it does nothing, and in fact the first tap fires off a link. How is that simple, predictable, or solid-feeling? It isn't.

context

I would be careful with putting so much emphasis on legality. The fact is that there are so many laws, and some of them are so weird and convoluted, and nobody really understands them all; pretty much everyone does several illegal things every day without even realizing it:

http://www.amazon.com/Three-Felonies-Day-Target-Innocent/dp/...

Under these kinds of conditions, if someone in an appropriate branch of government wants to nail you for any reason, they can. Especially now that widespread spying makes it much easier to identify specific transgressions.

So I am not so sure why you would take such a hard line on legality when in fact such a stance is just waiting to come back and bite you (and everyone).

... In fact, now it is the government's position that there are SECRET LAWS that you can be violating but not even know why you are violating them; they can arrest you and not tell you exactly why they arrested you, because the reason is secret. How are you supposed to engage in strictly legal behavior when you don't even know what is legal and what is illegal?


context

If you believe this then you have never run a company. Being responsible for a company is categorically different from being an employee at a company, even if that employee is very hard-working.

context

I would say that said applicant is very clearly wrong.

Sure, there are a lot of people out there who exaggerate on their resume, but this has nothing to do with the existence of people who actually do know things.


context

Really I guess the reason I am replying is that I am disturbed by this repeatedly-brought-up notion that highly productive people do not exist. I am not sure why it disturbs me so much, I think maybe because if potential highly-productive people read these things and believe them, they may be demotivated from reaching their potential. Maybe, I don't know.

If you don't believe that highly-productive programmers exist, it is only because you haven't yet met one.


context

But isn't that how tech conferences usually work? And doesn't that result in the kind of gender ratio problem about which there is much dismay?

Agreed, they did this in a clumsy way, but hey, they are trying something.


context

From the perspective of their own frame, photons don't travel. They are everywhere along their path at once. From our perspective, a photon goes through A first, then B. From the photon's perspective, that is not how it is.

So the concept of a photon 'picking' a destination point, or not, is mired in an assumption that isn't true (that there would even be anything to pick).
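For reference, the piece of special relativity behind this: the proper time along a path satisfies dtau^2 = dt^2 - dx^2/c^2, and light always moves with dx = c*dt, so dtau = 0 along the photon's whole path. (Strictly speaking a photon has no valid rest frame, which is another way of saying the same thing.)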


context

I started on gdb in college in 1989-1993, so I have experience in both worlds. gdb is not a very good debugger.

context

... and on the battery chemistry issue, Teslas use lithium-ion batteries, just like we have been using in laptops forever. They are not exotic.

context

Naah, all electric car batteries are replaceable in the sense you are talking about. "Swappable" means you can change them out in a short period of time (10 minutes or less) in lieu of recharging. Completely different concept.

context

Why is this being upvoted? It is a collection of generic PR statements with no new or insightful information.

context

I suggest you do some math on the actual facts, and try to figure out how much money the developer is actually making on a deal, before you brand someone a drama queen. I guarantee Phil Fish knows a lot more about the facts of the situation than you do.

(And this is not to deny that Phil Fish tends to have a lot of drama. I am just saying that to anyone in the game industry this kind of armchair quarterbacking is obviously uninformed, and then seeing someone attacked / blamed / whatever due to the conclusions of said armchair quarterbacking is just pretty sad. Speaking as someone who has been through this himself on multiple occasions.)


context

Does anyone have a link to the actual solution? It must be a pretty small equation. Searching around I am unable to find anything -- a testament to how useless Internet news is.

context

Thanks for noticing!

Some posters here have very weird perspectives. Yes, if someone wants to extrapolate some straw man, based not on statements in the article or evidence from the real world, but built from whatever feels easy to criticize thoughtlessly, then sure, it is easy to knock that straw man down. Whatever.

For what it's worth, I liked The Stanley Parable and had a nice chat with the author of the game at PAX last year. Why would anyone assume that something like this is not the case?

You guys do know that the subjects of articles you read on the internet are other real people also on the internet, right? Why would a poster here assume that I am some kind of inert punching bag rather than, you know, someone who's been on HN for a couple of years and involved in discussions?


context

Greg Egan has written a couple of less-jokey stories that deal with a related idea (that is more plausible).

The first story is "Luminous" from 1995, but I don't have a link to a free copy.

The second is "Dark Integers" which you can find here: http://www.asimovs.com/_issue_0805/DarkINtegers.shtml


context

Wow, this Schopenhauer quote is really painful to read. Given basic contemporary abstract thinking, it reads as completely naive.

This is not unique; I often feel this way when reading philosophy.

So I kind of understand where Hawking / Tyson / et al are coming from. I do think it is a mistake to dismiss all of philosophy out of hand, and I agree with the sentiment in this article that physicists are following an implicit philosophy that they do not understand, so there's a contradiction there. At the same time, most philosophy is honestly pretty bad.


context

Well, you're being a bit revisionist.

You can get memory safety without GC, and a number of GC'd systems do not provide memory safety.

If you think that, for concurrent systems, it is a good idea to let deallocations pile up until some future time at which a lot of work has to be done to rediscover them, and during this discovery process ALL THREADS ARE FROZEN, then you have a different definition of concurrency than I do. Or something.

If you want to know about code clarity, then understanding what your program does with memory, and expressing that clearly in code rather than trying to sweep it under the rug, is really good for clarity. Try it sometime.


context

Very weird one-sided article containing no details of substance, like how the actual projects are going and what has been implemented.

context

I don't even.

context

If you remove the constraint "in a web browser" then there is nothing here at all to talk about. Making an acronym for stuff people have been doing routinely since the 1970s is kind of weird. This is the problem with web programmers.

context

Indeed. The article ignores the history of success in the face of comments just like this one.

Example: Many, many naysayers said the idea that Tesla could begin shipping the Model S to customers in 2012 was absurd, that they were naive and didn't understand the complexities of building a vehicle like Detroit does, etc. Well, they just did it.

Before that, everyone said nobody would buy electric vehicles because (a) you can't get them over 100 miles in range, so nobody will want them, and (b) because there is no charging infrastructure. So Tesla just built bigger/better batteries and built a charging infrastructure.

The problem with articles like this is they are just some random guy saying stuff and it doesn't matter to him ultimately if what he is saying is correct.


context

The main objection is that the way edge cases would "differ from the real thing" depends on what their "computers" are like which may not be anything like our computers. In fact there is no reason to believe they are similar at all.

context

As soon as flexible screens become a thing, your phone and your tablet become the same device; you just unfold it when you want a big screen.

In light of this, the size argument being made in this article is not really meaningful in the medium-to-long term, unless these kinds of flexible screens never happen. (But they are being actively worked on, so.)


context

Not a carefully-written article. When you buy stock in a public company, that company is not getting your money -- the seller of the stock is! So if you buy a Pepsi stock or a Shell stock, you are investing in that stock, but not investing in the company in a traditional sense (in which they get money from you and expect to pay out more money later, with the surplus coming from their activities).

[There is the small factor that the company usually owns some of its own stock, and demand for the stock usually pushes the price up slightly, so the company's stock holdings do benefit slightly in a case like this; but this doesn't matter unless they sell, which is a rare activity compared to the general volume of investment.]

Since the article does not differentiate between stocks, bonds, and loans, it is essentially useless. I guess they didn't care enough to differentiate, and just wanted to say bad stuff about the foundation; or, maybe they just don't understand finance much at all, and didn't think about it.


context

I don't know why you think that?

a|+-> + b|-+> is a very special-case entanglement. Yes, you'll see that one mentioned in Wikipedia (and quantum cryptography, etc.) because it's very simple.

The general form for the states of two photons is exactly as you have listed for s_n. For some coefficient values of a, b, c, d, the photons are "entangled", for some, they are not "entangled". How do you know which is which? It is whether you can factor the polynomial into two separate expressions of the form (x|+> + y|->). If you can do this, the photons are not entangled; if you cannot do it, they are entangled.

Since most sets of (a, b, c, d) represent unfactorable expressions, you would expect almost any expression chosen at random to represent an entangled pair. Clean un-entanglement is the rare exception.
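To spell out the test in the same notation (standard algebra; writing the general state as a|++> + b|+-> + c|-+> + d|-->): a product of two single-photon states expands as

    (x1|+> + y1|->)(x2|+> + y2|->) = x1*x2|++> + x1*y2|+-> + y1*x2|-+> + y1*y2|-->

so matching coefficients gives a = x1*x2, b = x1*y2, c = y1*x2, d = y1*y2, which forces a*d = b*c. The state factors (is un-entangled) exactly when a*d - b*c = 0; any (a, b, c, d) with a*d != b*c is an entangled pair.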


context

Not sarcasm at all.

context

Well, I don't know that I would recommend this particular job. But I do know that the time I learned the most, and became a truly good programmer, was just a couple of years after college, when I worked my ass off doing very hard things that were initially beyond my ability. It was a company I started with a friend from college, and ultimately the company failed, but it was a tremendous learning experience.

Programming skill, at least, is like compound interest. The more you program, the better you will be in the future, which means you will learn faster in the future, etc. At a big company or in an undemanding situation, your pace of learning is pretty slow, being limited by the circumstances around you; and like any compound interest, if you get behind, it becomes pretty hard to catch up to where you would have been.

Whereas in a small-company situation or any situation where you are limited only by what you are physically and mentally capable of doing, you are learning as fast as possible. It is very good.

You could say that the Penny Arcade job is not as good as starting your own startup and working that hard, and maybe that is true; but my company shut down and left me $100k in debt (back in 2000 when $100k was real money!) whereas when you get paid, you are not taking the same risk. Maybe this also means it is psychologically difficult to work as hard. I don't know.


context

It sounds like you are arguing with me, but everything you said is agreeing with what I said, so I am not sure what to say here.

context

I first learned this in a book by a prominent physicist, so I am inclined to think it is an accurate reading of the physics. Unfortunately I do not remember which so I cannot give you a citation (it was possibly Brian Greene or Frank Wilczek), but maybe there will be a physicist reading this thread who can chime in.

The angular movement thing is this: Imagine you have a frame of reference on the tip of your nose. The X axis points straight away from your face, the Y axis is to your left, Z is up. Now start turning your head to the left. To a tiny observer living on the tip of your nose, the relative speed along the Y axis of a faraway planet has suddenly become very high. The further the planet, the faster the speed (this part is just grade-school geometry).
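In symbols, the geometry is just

    v = omega * d

where omega is the angular speed of the head turn and d is the distance to the planet: the apparent sideways speed v grows without bound as d does. (Nothing physical is moving that fast; it is the rotating coordinate frame doing the work.)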


context

There is no point to saying something is "outside" and something is "inside" in the extreme cases of the example you just gave, which may or may not be the case. What if the outer universe is not a random universe? What if its laws of nature are so unlike ours that we do not know what to say about it? (Maybe it does not have space or time, but it has other things instead.)

But I think this is a little bit beside the point, because as I mentioned in my first post, all of this is predicated on us adopting the naive view of time in our own universe, which is maybe not the best idea, given that we have plenty of evidence otherwise.

(And anyway, if you believe that "everything that exists" can be infinite, then there is no sense in saying our universe is being produced by some kind of simulation controlled from one particular other place, because of course it is, but this is probably happening in an infinite number of different ways from different places, such that there is not really any point in claiming it is happening at all; the situation becomes such that you could just draw a relation between situation X and situation Y and state that one could be causal of the other from a certain point of view.)


context

Even in the bay area, to most normal people, it comes off as silly. It only seems 'reasonable' among startup founders younger than 30.

context

This is a bad trend, but eliminating trackpad buttons is probably worse. (The fact that most people don't realize it is worse makes it even worse!)

context

Hofstadter's stuff is not remotely GOFAI. Have you read his books or looked at any of the projects?

context

Well, I left the start of an explanation in another reply. But the problem is that you are asking a question where it takes years to really understand the answer, and certainly hours for a commenter to write a summary. For someone to put that much effort in, they have to be motivated to put in the effort; but you are coming across in a very unpleasant way, and not offering anything in return, so why would anyone put in the effort just for you?

context

Look into what is involved in modern 3D rendering of high-detail scenes. It is NON-TRIVIAL, and I can tell you this as someone who has done 3D programming for 17 years.

Pro Tip: If an entire industry of experienced people finds something very hard, and you don't know anything about the topic but you don't see why it would be hard, maybe the relevant factor here is the "you don't know."

It reminds me of my mom who said on multiple occasions "All these rockets are dangerous and they explode; I don't see why the scientists don't just use the majestic forces that keep the planets in their orbits to move the rocket."


context

You may want to read up further on Fermat's principle.

Your response is a little bit ill-formed because Fermat's principle is about the time to travel between two points. You assert that light is not "taking the shorter path", but in doing so you are changing the destination point or else leaving it undefined. Instead, pick a start point, pick an end point, and see how light travels between those two points, with respect to your cube of glass.


context

I used to buy ThinkPads all the time. But they just keep getting worse (along with all other Windows laptops).

As for the new Thinkpad described in this article... lack of physical mouse buttons = instant fail. This seems like a weird style-oriented consumer move, not a business-user kind of move. I don't understand why everyone is in such a hurry to try and copy the MacBook (badly).

At this point, availability of physical mouse buttons is very high on my list of selling points for a laptop. I doubt I am alone.


context

P.S. If there were an easy way for me to pay a nickel per article I read (or however much), instead of having ads all over the internet, I would jump at that in an instant.

context

No, actually, they got fucked pretty hard. What you see in the movie is only about 10% of what happened -- but you are also misinterpreting what you saw in the movie, which is already pretty bad.

context

Costs of stuff like rent and cars are normalized to average income levels. (Though of course with cars there is a base cost of manufacture that makes them less flexible than something like rent).

Which is to say, if everyone works 25 hours a week, rents will go down (because if they don't, you have most properties sitting empty). If instead everyone works 60 hours a week, rents go up.

Of course, economics is really complicated and rarely works out this simply. But that's the idea. It's a little silly to post here presuming that someone in Mr. Vaupel's position has somehow not thought of the fact that when you work fewer hours your nominal earnings go down. (I think this qualifies for what pg was calling Middlebrow Dismissal). It's a more reasonable response to say, well, of course he has thought of that, but I wonder what the answer is?


context

I think we have different ideas about what constitutes optimization.

Sure, putting const in parameter declarations is easy to do. It may even buy you a little bit of speed because the compiler is a little bit clearer about pointer aliasing and whatever. But it's not going to make a difference in the equivalence classes of slow code / fast code / Really Fast Code.

Serious hardcore optimization usually involves changing the way the problem is solved to something different than the way the old code thought about it: either constraining the problem space further, or attacking it from a different direction. This usually involves rewriting everything since there are so many cross-cutting concerns. Sometimes one has to do this several times to figure out which way is really fastest. Microoptimization things, like whether you used const somewhere or not, are much smaller details that have correspondingly small effects.

For code that one isn't specifically optimizing, speed probably doesn't matter. There was an exception to this, where we hit a little bit of a bump in the late-2000s on platforms with in-order CPUs like the PlayStation 3 and Xbox 360, because they have such a high penalty on cache misses; this tended to make general-purpose code slower and result in much flatter profiles. But now we are pretty much out of that era.

In general, const is more of a protection than an optimization. This is especially true heading into the massively parallel future, where const just sort of tells you whether some code is known for sure to run safely in parallel or not... and running safely in parallel matters tremendously more to overall speed than the number of instructions in that bit of code, or whatever. (Anyway, C++ is not at all a viable language in the massively parallel future... so that is going to be interesting.)


context

Yeah, I was mildly sympathetic until it got to the part where they said in passing "oh, unfortunately, we just sort of liquidated the company."

context

I am pretty sure they at least need to sign it, which is why the common (overreaching and abusive) employment contracts include a clause saying that the employee will facilitate the filing of such paperwork.

I think naming and shaming is a good tactic. People should not be signing contracts with clauses like that in the first place, and if there is potential for future shame, maybe it'll make them think a bit more about it.


context

Hence my caveat about not being able to avoid it if an API forces you to use them.

But if I were making a replacement language that runs in the browser, among the highest priorities would be to make it not work via callbacks.


context

The "Supercharger" already works pretty well. It's the first attempt at such a thing. I would expect that maybe subsequent attempts would be even better?

So I don't understand why you are saying "the only way they can possibly make sense" is if the batteries are swappable. There is already sort of an existence proof otherwise, in California, right now.


context

And the point is that it's not about modeling!

What happens is that we invent crazy math that is not supposed to have applicability, then some years go by and it's like, oh my God, quantum mechanics is somehow exactly all about the operation of unitary and Hermitian matrices... how crazy is that?? etc.

If it were some kind of model that we are able to successively refine, the progress of discovery would look something like a Taylor series, and it would be no surprise that we are eventually able to model phenomena within some tolerance epsilon.

But that is not what is happening! Rather, it's that we discover that some large and sophisticated piece of math, for which we had not thought of any particular applicability, turns out to exactly represent specific advanced physical phenomena. This happens over and over.


context

No.

The surprising thing is that the models generate predictions far beyond the domains they were designed for (and far beyond the original knowledge of the people making the models), and that the predictions are so mindbogglingly accurate that there seems to be Something Else going on.

See the Unreasonable Effectiveness of Mathematics link below.


context

I have no objection to the actual existence of the article. I just think it's important for people to understand this fact, that seems to be increasingly misunderstood these days, that "technology" is about expanding the frontiers of human knowledge. Just because someone types program-like things into a computer doesn't mean what he is typing is technology.

If the defining property of your activity is that you are trying to negotiate messes that other people have made in order to make things happen, where the things you are making happen are not novel in themselves, that is pretty much what working in a bureaucracy is like. So you can think of it as "working in a vast decentralized computer bureaucracy" rather than "working in tech".


context

While I agree with much of what you are saying, I disagree strongly with the last sentence. I think if one feels that a certain career path is petty and selfish, one must make that clear. The reason is that, aggregated over large populations and years of time, societal notions of dignity and respect toward certain professions, and disrespect toward others, have a substantial influence on whether people enter those professions and what kinds of choices they make once engaged in them.

Of course this is a mechanism that goes horribly wrong in some societies, but it's our job to make ours as good as we are able.


context

(Though here's a link to a synopsis of Luminous): http://kasmana.people.cofc.edu/MATHFICT/mfview.php?callnumbe...

context

Wait, what? Is there no such thing as someone who has enough experience that they are allowed to speak directly and clearly about an issue? Rather, you think people are supposed to be all mealy-mouthed so that you don't feel like they are too presumptuous?

That doesn't make sense. There are people in the world who are worth listening to. The "democratic" nature of the web means that a lot of people post a lot of crap, and maybe some people are so used to reading crap that they have just forgotten what it's like when people really know what they are talking about. I don't know.

There's a huge difference between Rails Bros and smart guys who attack the hardest problems they can, whenever they can. Charles is one of the latter people and has been for, I don't know, 16 years? I don't want to live in a world where someone like that is not permitted to speak in an uncushioned way.


context

(In case it is not crystal clear, the whole point of the article is that he used to be a Mature Programmer a while ago, and is now in a post-Mature-Programmer phase.)

context

I also don't see any way to search for non-economy-class flights...

context

I stopped reading early, when he's claiming pi is better for the area of a circle, because that revealed that the author hasn't really thought about this very much.

If you look at the equations for the volumes of spheres in n dimensions (with 2D being just one of them), tau shows a clean pattern. pi leaves you with a mess.


context

In the Guessing Game, you suggest using abs() to get a nonnegative number, but this is wrong, since it would make 0 less probable than all the other values.
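A self-contained sketch of the bias (my example, not the book's code): abs() maps both n and -n to the same output, so every positive value gets two chances while 0 gets only one.

    fn main() {
        // Histogram the outputs of abs() over a uniform integer in -3..=3.
        let mut counts = [0u32; 4];
        for n in -3i32..=3 {
            counts[n.abs() as usize] += 1;
        }
        println!("{:?}", counts); // [1, 2, 2, 2]: zero is half as likely
    }

(There is also the corner case that abs() of the most negative integer overflows, which panics in debug builds.)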

context

Extrapolation is not at all the same thing. When you aren't getting network packets, yes, you run locally to do some kind of approximation to what is happening (in fact for a good game, your frame rate is never contingent upon what is coming in over the network, you just always do your best to display whatever info you have right now).

But that is totally different than the GC situation. With a stop-the-world (STW) GC you cannot even run, so how do you expect to be able to display anything on the screen? Even with a non-STW GC, the reason the GC has to collect is because you are running out of memory (unless you are massively over-provisioned), and if you are out of memory for the moment, how are you going to compute things in order to put stuff on the screen?

Accessing disk/network/etc induces latency, yes, but that is why you write your program to be asynchronous to those operations! But this is a totally different case than with GC. To be totally asynchronous to GC, you would need to be asynchronous to your own memory accesses, which is a logical impossibility. I do not see how you even remotely think you can get away with drawing an analogy between these two situations.


context

And video games.

And avionics/aerospace.

And self-driving cars. And medical equipment.

etc, etc. You can list lots of fields for which this is unacceptable, and they are a lot of the really interesting fields.


context

"Obviously"? Really, is it obvious? You aren't drawn to question the supposed superiority of this software even for a second?

context

He obviously did not try the demo. The demo meets all specs that Mike listed in his talk and it exists today, thus nullifying the entire thrust of the article.

The demo is not consumer-product-ready, as it is currently expensive and the tracking requires you to paper your room. But those things are all solvable (inevitably with a bit of time, which is why Mike said 'within two years').

A number of the key people responsible for the Valve work are now at Oculus with a mandate to consumerize the best tech possible. So yeah, it is going to happen. Do not believe this article. Oculus CV1 may not get all the way there but it will be better than is necessary to show clearly the writing on the wall (the main factor in its quality being cost, which the Facebook deal helped tremendously with). CV2 will, I expect, be pretty badass.

(I have tried the Valve demo and then spent a few days working with my game on their hardware.)


context

Untrue, because the reason the microtubule coherence was posited in the first place, at all, was as an origin of consciousness. There wasn't another reason to presuppose it. So the fact that the activity exists does have information content with respect to the central thesis. It certainly doesn't prove it, but it moves the needle.

Analogy: Suppose the Earth is covered with clouds and we have never seen the sky, we have not invented space ships yet, etc. Nobody knows why tides happen. Someday someone predicts there is a GIANT rock orbiting the Earth not too far away, and everyone says that is crazy. But eventually you send a rocket up with a camera, and you see this giant rock there! Whoa. Since the tides caused you to look for the rock, and you found the rock, you have reason to suspect the rock does cause the tides. Maybe it doesn't -- further verification is required. But the big rock is evidence, it moves the needle. That is what science is, is making testable predictions and then testing them and then letting the results of those tests help you understand what is going on in the world.

Invoking the FSM, or discounting evidence because it doesn't match preconceived notions, is in fact the kind of thing that is the bane of science and always looks embarrassing / shameful in retrospect. I would hope that people at HN understand science well enough to see this pattern and not participate.

P.S. Re the "according to the article's narrative" snipe, uhh, some of us have been following this issue since the 90s when the idea was proposed. "You guys are crazy" is an accurate description of the majority consensus.


context

Start the Nest shutdown clock ... 18 months?

context

Did you talk to any Korean citizens while you were there? All the ones I talked to wanted us gone.

context

Of course not, but look: you manage the investment arm of a charity fund and your job is to provide more income for that fund in the future.

If you put money into stocks, and get a return on that, that return means very concrete things: more people who get polio vaccines, more people who get dewormed, etc, etc. Very concrete results.

If you don't put money into stocks, because they are unethical ... well, how exactly? As we have mentioned the stock has already been sold. So, you don't do it because by doing so you are helping support a market system that may provide a setting for future unethical corporations to arise in and go public? (Along with many ethical corporations?) Or because by providing demand for the stock now, you retrocausally made it possible for that company to have IPO'd years or decades ago?

These are very abstract and weird concerns when your job is very directly to help people.

You say "if unethical corporations faced divestiture", but even a large fund like this one has no say in that. If they don't invest in McDonald's or Walmart or whatever, it doesn't matter in the grand scheme of things.

So seriously, if you were running the fund, what would you do? For real. If you make less money, actual human beings die who would not have died, in volume.

(All of this said, I would feel pretty dirty about buying the stock of a prison company, anyway. But railing on them for stuff like McDonald's and Walmart just ... doesn't make sense.)


context

Example: Everyone on the terrorist watchlist gets a face photo associated with the name. Then run face detection software in the booth and don't let the traveler out if it matches.

That is a pretty straightforward next step from what is there now. I am not sure how likely it is, but if you claim you can't possibly think of abuses, then you just aren't thinking very hard.


context

But let me also just say that you do not have to believe this at all in order to believe my original point; I just brought it up as an extreme crazy case.

So if you don't believe the extreme crazy case, think of the standard example: you get into a space ship or something and zip around really fast. And you are thinking about things closer to you. The math says the same thing: as your light cone changes, the set of spacetime points that are at time distance 0 from you changes as well. So from your relative position the "now" at these faraway points goes back and forth. This is basic, basic relativity.


context

A sizable contingent disagrees with this statement. For example, Tim Noakes says "never prescribe a statin to a loved one".

context

It is somewhat infeasible to use linear colorspace, because you need a lot more precision in order to do this without banding. You end up with substantially bigger texture maps, possibly twice as big; but actually it ends up being a lot more than twice as big, more like 8x or 12x, because the compressed-texture formats that graphics chips are able to decompress onboard do not support this extra precision. So if you were to try using something like S3TC compression to reduce this bloat, the result would be really ugly.
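
To make the precision point concrete, here is a minimal sketch (standard sRGB transfer math, nothing specific to any game or engine): the first several 8-bit sRGB codes all collapse into the same 8-bit linear bucket, which is exactly where banding in dark areas comes from.

    #include <cmath>
    #include <cstdio>

    // Standard sRGB -> linear transfer function.
    float srgb_to_linear(float s) {
        return (s <= 0.04045f) ? s / 12.92f : powf((s + 0.055f) / 1.055f, 2.4f);
    }

    int main() {
        // The darkest sRGB codes are finely spaced in linear light; an 8-bit
        // *linear* texture collapses them all to zero, hence the banding.
        for (int i = 1; i <= 4; i++) {
            float lin = srgb_to_linear(i / 255.0f);
            printf("sRGB %d/255 -> linear %f -> 8-bit linear bucket %d\n",
                   i, lin, (int)(lin * 255.0f + 0.5f));
        }
        return 0;
    }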

In general, games only use light-linear texture maps when they also need HDR in the source texture, which is not that often. Ideally it is "purest" to use HDR for all source textures, but nobody does this because of the big impact on data size. (And even for the few textures that are stored as HDR source, often the data will not be integer, but some other slightly-more-complex encoding.)

[Claimer: I have been making 3D games professionally for 17 years, so this information is accurate.]


context

Yes, it is not possible. Learn about what a Z Buffer is and how it works.

Dude seriously, I have been doing this a long time. I am going to stop replying after saying just one more thing:

If you are running on an OS like Windows (which this product is targeted at), you do realize that the OS can just preempt you at any time and not let you run? How do you predict if you are going to finish a frame if you don't even know how much you will be able to run between now and the end of the frame?


context

See my Rocket comment above. But in reply to this specific comment I will drop you a hint (this hint is still just a small piece of the whole situation):

3D rendering is so deeply pipelined that it is difficult or even impossible for the program to know if a frame render is going to finish on time. It takes a long time to get information about completed results back from a GPU; on PCs you almost certainly can't get that info during the same frame you are rendering, unless you are rendering tremendously slowly.
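
To illustrate the lag (a hypothetical sketch using the standard OpenGL timer-query API; it assumes a GL 3.3+ context with loaded function pointers, and is not code from any particular engine):

    GLuint query;
    glGenQueries(1, &query);

    glQueryCounter(query, GL_TIMESTAMP);   // ask the GPU to record a timestamp

    // ... submit the rest of the frame, swap buffers ...

    GLint available = 0;
    glGetQueryObjectiv(query, GL_QUERY_RESULT_AVAILABLE, &available);
    if (available) {
        GLuint64 gpu_time_ns;
        glGetQueryObjectui64v(query, GL_QUERY_RESULT, &gpu_time_ns);
    } else {
        // The typical case right after submission: the GPU hasn't executed
        // the query yet. You usually learn how long this frame took a frame
        // or more later, well after you needed the answer.
    }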

In order to make an estimate about whether the frame is going to be done in time, you would have to guess. Okay, then, so now you decided to stop rendering this frame, what do you do? Leave a giant hole in the scene? Turn off the postprocess? Draw low-detail versions of some things (hint: still very slow)?

Your program does not even really know for sure which pieces of the scene are fast to render and which are slow. It does not know if specific textures are going to be paged out of VRAM by the time you get to a specific mesh, or not. etc etc


context

There's no nicer way to say this: you're scum. Have fun trying to sleep at night.

context

The author of this article does not seem to understand what good PR agencies do. He thinks PR means 'web spam'. A good PR agency is about human connections: they will get press interested in doing a story, they will get you live interviews, they will provide very helpful advice on how a message will be received.

Article is alarmist and weird. Not recommended.


context

How does this not have more upvotes?

context

I don't understand why people think this is going to be useful.

The inbuilt assumption is that people care enough about the data to want to search it frequently and thoroughly. I don't think that is true. Facebook is mostly ephemeral junk data that you don't care about; this has been true ever since they changed their UI to the Twitter-style "what are you thinking right now?" input / streaming.

In order for search to be useful they first have to backtrack heavily on what their entire platform is about. Which would be hard.


context

Creepy reply is creepy.

About this:

"Plus, I mean, what if I revealed myself to you, and then you were like, oh shit, I better take what he says a little bit more seriously, wouldn't that just be embarrassing? I don't want to do that to you."

No, by all means, go ahead. I am interested in having a productive discussion about programming, so if you can share your experience in a way that convinces me, I am totally open to it. If it turns out I am wrong, I won't be embarrassed, I will just change my opinion so that I am not wrong any more. This is how one becomes a good programmer in the first place: by paying attention to what is empirically true, rather than what one is originally taught or what seems exciting or what is in theory better.


context

This is true (especially: breaking vectorization). But premature optimization is not a good idea, and if you are doing real optimization, you are going to rewrite that piece of code 10 times anyway, so it is in a different class of problem and the putty stuff I was saying before does not apply (i.e. this code is in the 1% or so of the codebase that is highly performance-sensitive).

Optimized code is just a different thing from general code (if one is a productive programmer).


context

I disagree. I have 16 years' experience programming in C++ (and substantial experience in other languages before that), and I find that an important factor in code clarity is not writing stuff like this.

Yes, you can understand it, but it takes more brainpower to do so than it should, especially once you get beyond trivial cases. It is much better just to write it the long way.


context

Also, no, I don't believe GUIs are inherently difficult. I do think most GUI libraries are just terrible though, because they have bought into bad GUI paradigms.

If a GUI is your example of something that is difficult, we are just living in different worlds and it's a challenge to have a productive conversation. I think a difficult task is something like "make this ambitious AAA game run on the PlayStation 3 performantly". That is pretty hard.


context

It depends on what the application looks like. The most straightforward and robust thing is to block on events. But if you are doing tons of this kind of thing, and the data is relatively self-contained and packageable, then I would do something like spawn a worker thread that gets the data and then puts the data into a result list (that, again, the main program blocks on).
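
(A minimal sketch of that worker-thread pattern, names invented:)

    #include <condition_variable>
    #include <mutex>
    #include <queue>
    #include <string>
    #include <thread>

    std::mutex              results_mutex;
    std::condition_variable results_ready;
    std::queue<std::string> results;

    // Worker: fetch the data, push it into the result list, signal.
    void worker() {
        std::string data = "...payload from disk/network...";  // stand-in
        {
            std::lock_guard<std::mutex> lock(results_mutex);
            results.push(data);
        }
        results_ready.notify_one();
    }

    int main() {
        std::thread t(worker);

        // Main program: block until a result shows up.
        std::unique_lock<std::mutex> lock(results_mutex);
        results_ready.wait(lock, [] { return !results.empty(); });
        std::string data = results.front();
        results.pop();

        t.join();
        return 0;
    }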

context

I don't see it as fundamentally different. Callback means some or all of: "I don't know when or where I am being called from, or what the state is of the rest of the program at this time." All of those are bad things if you are trying to write robust software, so you want to avoid them unless there's a really good reason.

context

Callback Hell is certainly a real thing. I decided 12 years ago that I would never use callbacks if I could avoid it (the only way you can't avoid it is if an API forces you to use them); I have never looked back. Regular, simple, straightforward imperative flow control is a very powerful thing, and any time you give it up or make it more squishy and indirect, you had better be getting something big in return. Usually you aren't.

That said, what the article proposes as a solution is bananas. You don't need to do crazy functional acronym things; just don't use callbacks. Good C/C++ programmers in the field where I work (video games) do this all the time. It's not hard except that it requires a little bit of discipline toward simplicity (which is not something exhibited by this article!)


context

The Model S solves the issue via the ability to charge at Tesla's "supercharger" stations (much faster recharge than any other EV), and via the supercharger network that Tesla is currently building (some stations in California are currently open, and they recently presented their expansion plan for this network).

context

Oh, hey dude.

40 amps at 200V will completely charge a Roadster in something like 7 hours. I don't know how long for a Model S, but it is probably longer. I am only familiar with the day-in, day-out of the Roadster so I will stick to that mostly.

So the "charge the car all night (7-8 hours) scenario" only makes sense if you need to charge up the battery 100%, i.e. you were on fumes before you plugged it in, which means you drove 200-240 miles the day before. This may be true for some people but it is not going to be true for most people most of the time.

I drive from SF to Berkeley and back most work days, a 25-minute commute each way, and I like to drive it like a sports car, so I use a relatively large amount of power for that length of trip. Generally I use about 20% of the Roadster's battery on such a day, so that's probably about 1.5 hours to charge from a 220V outlet. If everyone did that, you would probably want to stagger the charging times, but it is totally doable even with current setups.

I do think that there would be some increased power draw and that we would want to beef up our electricity infrastructure a little. But that is pretty different than what I see as a Republican talking point, which is something like "there is no way that the USA can support everyone plugging in their EV, it's impossible." That is not my experience as an EV driver.

When I was coming up with the dryer analogy I was just using numbers I pulled off the Internet about what people were measuring their dryer's pull at. It's possible dryers just are not very efficient, I don't know! (Though I thought the whole Energy Star thing was supposed to put pressure on that).


context

Loss-leading is for losers. I mean, it works if you are a large entity with super-deep pockets and a long horizon before you need to make your money back. For small developers it is death.

The first dot-com bubble was all about user-acquisition and loss-leading. Look where that ended up.


context

I don't have much patience for articles like this.

Look, if you can't get someone to open your app more than 3 times, maybe it's because you are not making interesting things.

If you build something that is high-quality and that makes a difference in peoples' lives, they will flock to you, because there is so little of that. On the other hand, there is an overabundance of 99-cent and free-to-play software that has no real reason to exist other than to try and make the developers money. But this is the mindset that this article comes from (evidence: the author's first three pieces of advice are "cross-promote", "market well", and "internationalize", things that have nothing whatsoever to do with what is being made or how good it is).

These are not useful tactics, at least not in isolation, because they don't address the core problem: that a developer in this situation is a dime-a-dozen. The solution is to stop being a dime-a-dozen.

You don't need to differentiate your particular piece of software so much; work on differentiating yourself as a developer, in terms of the quality and interestingness of what you produce; be a thought-leader rather than a follower who mainly thinks about cross-promoting; and once you manage these things, you will automatically be doing okay. (And you are much more likely to be satisfied with your life, which is quite a nice side-benefit).

Of course, most people will not follow this advice, because it requires effort, introspection, course-changing, all that stuff. Most iOS developers will continue going as they are and continue suffering the consequences. Not pretty but that is just how it is!


context

I second this. Ace Hotel is rad.

context

As a Roadster owner of two years, who uses the car as his daily driver, I can tell you this is completely untrue (the Berkeley-MV thing).

You can drive from SF to Santa Cruz and back on one charge.

You can drive from SF to Mountain View and back, and then there and back again, on one charge.

I only need to think about recharging if I am going on long road trips -- say, SF to Lake Tahoe. This is exactly where the Supercharger comes in and Tesla's announcement is way huger than I expected. It makes me want a Model S even though I have been thoroughly delighted with the Roadster for two years.


context

I find this attitude moderately disgusting and I would decline to do business with anyone who thinks about the world this way.

The reason is: it shows a worldview where everything is about your personal advantage and not making things bad for yourself. It comes across as selfish and petty. That final paragraph is just kind of gross.

I want to do business with / hire / socialize with people who care about the good of the world, hopefully more than they care about their own small situation. It's hard to describe what that looks like -- it is different for everyone -- but it almost certainly does not look like this.


context

If I were a hard drive manufacturer, and I saw the inevitable swooping-in of SSDs to replace my mainstream market, I would enter a strategy of taking profit now, because there won't be much profit to take on these devices in the future.

context

I agree that it is sort of good-natured rivalry, but it sort of isn't. I went to Berkeley and know that it is true.

context

I guess there is a max reply depth? So I am replying here...

I am a game developer, actually, and I believe games can have great social value. So I support you in pursuing your idea.

"Rewards" systems like the one described here, though, are not about giving anything to the audience. They are purely about taking money away from people, and doing it as manipulatively and sneakily as possible. I believe the net social value for things like this is deeply negative.


context

You are making a lot of weird assumptions. I do have surroundings appropriate to me, and a circle of peers who have a value system I find interesting, etc, etc.

What I am talking about is what I perceive to be the mindset of a great many young people entering the working world today. This has nothing to do with where I hang out (indeed, I avoid stuff like that, actively!). As for YC Demo Day, I went because I didn't have the full picture of what it was like. Now that I know, I have not gone back.


context

San Francisco, CA

Thekla, Inc is hiring good video game programmers to work on unusual and very interesting games.

See http://the-witness.net/news/?p=815


context

Availability of 40ms out of every 50ms... how many 9s is that? Oh wait.

"Go, the language with zero nines of availability."


context

Upvoted because it's true.

However, it is unlikely an HN-style site would fare better. But who knows.

context

(The language implementation can of course engage in strategies to avoid a full array copy, that you as the user have little insight into, but these are often questionable as they slow down run time in other cases, and anyway, they can only mitigate this problem, which is not going to go away.)

context

"This isn't a problem"? You're so sure?

How many nodes are there? Let's presume there are a lot, otherwise the problem is trivial and speed doesn't matter anyway. So now you have an N-long immutable array that you are copying every time you want to change a node pointer? So you have changed the operation of pointer-changing from O(1) to O(N)? What does that do to the run time of the algorithm?

Also, your garbage velocity just went WAY up. What does this do for the runtime of your program generally?
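
(A tiny sketch of the asymptotic point, types invented; this is about the shape of the cost, not anyone's actual implementation:)

    #include <vector>

    struct Node { /* ... */ };
    using NodeArray = std::vector<const Node*>;

    // With an immutable array, "changing" one pointer means copying all N
    // of them (and allocating a fresh N-pointer array for the collector to
    // chew on later), versus the O(1) in-place store a mutable array allows.
    NodeArray with_pointer_changed(const NodeArray& old, size_t i, const Node* p) {
        NodeArray copy = old;   // O(N) copy
        copy[i] = p;
        return copy;
    }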


context

He also promised to close Guantanamo.

context

It works because when you die it costs them way less than $500k to dispute the situation and avoid paying you anything.

context

The issue isn't just kids, it is the way these games are designed to prey on peoples' psychology the way gambling does.

See for example the factoid that 0.15% of players spend over 50% of the money in f2p games (and these games make tons of money):

http://www.gamesindustry.biz/articles/2014-02-26-15-percent-...


context

In the words of the immortal George Carlin:

http://www.youtube.com/watch?v=7W33HRc1A6c


context

To my recollection, most people disagreed with said statement. The thought was that any kind of quantum coherence would not hold for a nonnegligible time, because hey, decoherence. This was before we started seeing proof of quantum effects in biology at all.

context

Indeed, such a lousy article. The stock is still net up almost 50% in THREE MONTHS and yet somehow he is writing some doomsday scenario article about it.

It reminds me of the crappy articles talking about how TSLA was down and doing terrible, when the price was around 25, months after it IPO'd at 17.

I wish I could downvote this article.


context

Are you kidding? This is well known and the administration even admits it.

For example:

https://www.commondreams.org/headline/2013/11/27-2


context

And actually, yeah, forget about the relative speed, since that is not central to the real issue. The real issue is just that when your light cone changes, the set of points that are simultaneous with you changes (all these points are outside your light cone except for the one you occupy). You cannot observe them directly because they are outside your light cone.

context

Did you read the OP? The whole question is about whether the apparent randomness in quantum mechanics means the universe is fundamentally nondeterministic. We cannot make accurate predictions of measurements of quantum state apart from saying things like "A will happen with some probability, otherwise B will happen". It is that "with some probability" that is the question.

(Yes, at the level of the macroscopic world we can make consistently accurate predictions, but physicists would say this is because we are operating at a scale where the statistical nature of quantum mechanics averages out, etc. Unless you are Carver Mead or one of the other wave guide kinds of guys who believe there actually is no randomness in QM and it is deterministic all the way down.)


context

When you get in your car to go to work, you are endangering peoples' lives. So, uhh... yeah... this is a justification that can be stretched quite far.

context

Airbnb is for people with more time than money. Hotels are for people with more money than time (they do not want to spend time chatting to arrange a booking, or to set up social proof for themselves on yet another social network just so that people will consider them).

My one Airbnb experience was massively negative because I value my time.

Hotels are not remotely doomed. Demand for them may go down, though.


context

I am not sure you can say this when nobody ever even manufactured such a screen as an experiment.

context

Actually, the effect happens just as much during downscaling. It is just not as noticeable, because your artifacts happen at subpixel scale. But they do affect the output colors.

The math does not really know the difference between upscaling and downscaling.
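
(A minimal numeric illustration, standard sRGB math, values made up: averaging two pixels, which is what a 2:1 downscale does, gives a visibly different answer in gamma space than in linear light.)

    #include <cmath>
    #include <cstdio>

    float to_linear(float s) { return (s <= 0.04045f) ? s / 12.92f : powf((s + 0.055f) / 1.055f, 2.4f); }
    float to_srgb(float l)   { return (l <= 0.0031308f) ? l * 12.92f : 1.055f * powf(l, 1.0f / 2.4f) - 0.055f; }

    int main() {
        float a = 0.0f, b = 1.0f;  // a black pixel next to a white pixel

        float gamma_avg  = (a + b) / 2.0f;                                 // 0.5
        float linear_avg = to_srgb((to_linear(a) + to_linear(b)) / 2.0f);  // ~0.735

        printf("gamma-space average: %.3f, linear-light average: %.3f\n",
               gamma_avg, linear_avg);
        return 0;
    }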


context

Among the potential solutions, he doesn't mention reversible computing, which I found weird, because the whole point of reversible computing is drastic reduction in power draw.

Maybe he doesn't think it would ever be fast enough, or maybe he thinks it can only apply at really small scales (i.e. nanotech) and there's no smooth incremental path there from where we are now?


context

You are correct. The whole reason things are set up this way is historical, compatibility with CRTs.

context

If you can't handle arbitrary-length strings, you pretty much don't know how to program. Really.

context

Yeah, this discussion went a lot better than the other one! Certainly more enjoyable, anyway.

context

Not allowing other code in is a bad thing. See my putty reply above.

context

You don't seem to provide any context about what the circle of fifths is, or even what a fifth is for, thus I have no idea why I should care, and in general it makes it very hard to assimilate the article in a useful way.

context

As someone who worked calmly for years and made a quite reasonable amount of money, while retaining full control over everything, I must recommend the author's approach.

context

I don't believe that parent/child relationships are a good way to structure programs. I use them sometimes, but very rarely. (Current codebase is 180k lines of C++).

context

Nope, because you can do static analysis (a.k.a. search through your program text) to find out who calls a function and when. The whole point of a callback is that this doesn't work.

context

Okay, so what if a particular gas station can't supply 100kW, so what? You charge at 50kW? That is definitely less convenient, but still feasible for filling up your battery and getting you where you want to go. It's still infrastructure that works, so the anti-EV argument "there's no infrastructure and the infrastructure can never be built" is still obviously wrong. It's also still obviously wrong if you drop the charging to 25kW. etc, etc.

Besides, do you know that a typical gas station can't readily supply 100kW? How do you know? Has anyone even thought about this seriously (except Tesla)?

I am a little bit shocked by the amount of specious naysaying that is happening in this thread. This is supposed to be Hacker News, where people are motivated to really think about problems, to build solutions, etc, etc. I don't see any of that attitude in some of these replies.


context

Okay, but ... did anyone ever say that it would save you money? I don't see that in the thread anywhere.

I certainly wouldn't make that argument for cars of today. But I see how rapidly the price is coming down, and I see what price targets Tesla is aiming for, and that is pretty interesting.

(And if you really want to compare price, it's probably a good idea to try to account for externalities. It's hard to estimate those for gas cars because so much is kept secret, but they are substantial.)


context

My 2006 M3 had SMG and I didn't like it much. (Fine for highway driving, sucks for city street driving). Maybe the new one is better?

context

Like what? AAA has a definition involving budget, it doesn't just mean "not indie". I am not aware of anyone operating at that budget level and using C#.

context

AAA studios may use C# for tools, but exactly zero of them use C# for their actual game engine.

context

Don't forget the cost in electricity of finding deposits, mining them, transporting the pre-refined gas, transporting the post-refined gas, and lobbying Congress.

context

It is not a stunt. It is a direct counter to the anti-EV argument "Electricity is made from fossil fuels anyway, so by driving an EV you are just shifting the source of pollution to the power plant."

Tesla is attacking every single anti-EV argument in a very deliberate way and most of the attacks are strong successes.


context

For sure driving the car fast is one of the most fun things!

context

P.S. My car has about 18k miles on it, so I don't think I have a particularly youthful battery at this time!

context

Well, it is the speeding part that is probably the issue.

It is true that if you want to drive to Mountain View at 85mph you probably can't do two round trips. You could certainly still do one round trip without thinking about it, especially if you charged in range mode before heading down there. Actually, from your figure of 200mi range, maybe you didn't know about charging in range mode? That gives you about 240mi of range (maybe a little less with an older battery).

The issue about driving speed is just that, as with any car, wind resistance applies dramatically more force to your car as speed increases. If you are driving 60 (which is at or above the speed limit on 101 and 280), it is not bad, since you are about at the efficiency the car's published numbers are based on; but if you are driving 70-75, range is very much decreased.

Here is a range vs. speed graph for the Roadster:

http://www.greencarreports.com/news/1019610_2009-tesla-roads...

(There are more-detailed charts on the internet if you search for them). Notice that if you are driving 55, you get about 70 more miles of range than if driving 75!

On the other hand if you want to drive 45 and feel like a total ass then the car will go really far!

So this is definitely one thing that will improve as electric cars and infrastructure get better: less dependency on speed or weather conditions (driving through heavy rain will also decrease range by quite a bit). In a Model S with the supercharger network already in place, you could probably drive from SF to LA like a bat out of hell and not worry about it, which is cool.

But, I am just saying, I drive round-trip to the south bay all the time and do not think twice about it. That is the kind of range that is trivial for the Roadster. Unless you commute at 90mph and aren't using range mode and/or didn't charge the car all the way up.


context

I bought a bunch of shares at 17. Seems to have been a good choice. If I were a serious participant in stock gambling I would have bought many more...

When it comes to this Bloomberg report, I read it and think, this very small production delay, and the reasons behind it, are all signs of a smart company doing the right things. The delay seems to have very little impact on the long-term success of the company. What really matters there is just whether people want to buy the cars, which this news does not have much bearing on (except maybe that people are more likely to buy the cars if they are perceived as paragons of quality... like the public perceives iPhones, etc).

The supercharger announcement was way above and beyond anything I expected. I own a Roadster and now I want a Model S because, as someone living in California, it fixes the one substantial issue with the Roadster: inconvenience of long road trips. The existence of the superchargers is way more of an upside than delayed production is a downside.... yet the stock goes down.

So, yeah... this looks like full-on market irrationality, just people being spooked.


context

Did you read the essay? Actually, both of them? The reason mathematics is considered Unreasonably Effective is because we have never seen anything with similar effectiveness. Ever.

context

Is hooking up your toilet considered technology? This kind of plumbing used to be super-high-tech at one time in human history. I think you would find few people who think of it that way now, though.

Hey, make your own choice about what you want to do with your life and what you want to consider cool.


context

If you want to make web games, make web games. That's fine. Just don't be deluded that this is "working in tech". It is not.

context

How many of these games have you actually played? It sounds like you don't know what you are talking about.

context

I am looking to move my company's office and wanted a good way to search for places, so I went to 42Floors's site this morning. I found a couple of reasonable offices on the map, clicked on one, filled out the few things the site wanted, and then the page said "a representative will contact you shortly" and I had no further way to take any more actions to actually find or view any offices. Now it's 4:15pm on a Friday and they still haven't gotten back to me, but I can just go to Craigslist or wherever and email people straight away and go look at offices with very little delay.

In short, I can't tell what value the site is supposed to offer (it certainly doesn't seem to be helping me find an office).

So I think they have bigger problems than the way they open their blog postings. I would say that shit isn't real enough yet...


context

Why is this here? How do I downvote the post or recommend it for the "stupid twitter he-said-she-said" category?

context

Sure. But if you expect that everyone should engage in mediocre speech, just because you don't know whether you should respect them or not, the result is kind of a sucky world.

context

Oh wait, now it lets me reply here. See above.

context

I am referring to Justice Potter Stewart's famous 1964 opinion statement in a Supreme Court case on pornography. Look up "I know it when I see it" on Wikipedia.

context

It's like pornography...

context

My feedback is, please stop doing junk like this and go pursue some idea that has some kind of social value.

context

Google bought Twitch. They are not an independent entity. This is Google doing this.

context

I am sorry but I think you are posting in the wrong thread. We are talking about Christopher Nolan here.

context

Just like a C++ program would, you have a loop that blocks on network input. This has been solved since the 1970s.
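
(A minimal sketch, POSIX sockets, assuming 'sock' is an already-connected descriptor:)

    #include <sys/socket.h>
    #include <unistd.h>

    void run(int sock) {
        char buffer[4096];
        for (;;) {
            ssize_t n = recv(sock, buffer, sizeof(buffer), 0);  // blocks here
            if (n <= 0) break;  // peer closed, or error
            // ... handle n bytes of input, update state, send replies ...
        }
        close(sock);
    }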

context

Agree 100%. Not much more to say about this.

context

If they would stop announcing things that suck, we would have more faith that the next thing they announce will not suck.

It is just straightforward extrapolation.


context

This was definitely a letter for the Pinks. I found it hard to take seriously.

context

Unimpressed by this response. This is the same kind of thing said by everyone who makes bad products.

context

That sentence is not informative. The backbone of the regular internet is fiber optics as well (usually).

context

Everybody who has tried to use const in even medium-sized C++ codebases is familiar with the behavior discussed here. It's not about specific examples, it is about what happens when you have a lot of code.

context

Whether they don't right this second doesn't matter. Their terms of service say they can. If they decided it was unthinkable that they would ever do this, they could have written their TOS to be less overreaching. But they didn't do that. Therefore they think it's a possibility (if in fact they are not already doing it. Are you so sure? How do you know?)

context

So don't do that. Make a few UIs. They are a big company, they can do that if they actually have anyone who knows how to program.

context

Perfect point! The fact that you think that was the outcome of the investigation is due to the huge amount of money Toyota spent denying the problem.

context

You might want to read ourincrediblejourney.tumblr.com.

Usually this kind of announcement is followed by another one, 9 months-2 years later, of the service being shut down.

This is just the first step of that pattern again.


context

Please make an effort to read and understand my point.

Yeah, maybe there was a case when a couple peoples' lives were at stake but nothing happened.

The real issue is that tens of thousands or hundreds of thousands of peoples' lives are at stake RIGHT NOW, under conditions that are much less controlled than what people are deriding as uncontrolled conditions. But people are griping about the 1-2 instead of the 10,000-400,000.

How is this not dead simple to understand? I don't get it.


context

Look at what Toyota did with the whole unintentional acceleration thing. About as irresponsible as you can get.

context

They might be reasonable in isolation but are not being levied remotely in proportion.

I just realized what the problem is: this is bikeshedding. Everyone knows about people driving around and feels qualified to have moral indignation in that area, whereas few people know anything about actual cars.


context

This discussion is going insane.

I see lots of people arguing about the safety of how these guys conducted the hack. Okay, sure, there is probably an issue there of some degree.

But it's a very small issue compared to the fact that hundreds of thousands of vehicles are arbitrarily hackable right now, with more rolling off the assembly line all the time, and people are driving these around right now.

Why is most of the discussion here about the minor issue? Why is everyone so eager to derail discussion from the major issue? I thought HN was trying to be a reasonable place.


context

Bad calculation. One auto hacker can shut down all vulnerable cars simultaneously.

context

If it were not demonstrated under real conditions, the car companies would just say "this was a fake test not representative of real-world conditions, isn't that true Mr. Journalist?" and the journalist would have to admit that that was true and then they would say "Under real-world conditions our cars are safe; customers have nothing to worry about."

This has been their playbook about everything for a long time so I don't know why you think it would be different in this case.


context

Given that cars get recalled all the time for "this one part is kind of flimsy and might break 3% of the time", I am not sure why "some guy in China can drive your car off a cliff" is not grounds for an immediate and full recall.

If you talk to auto manufacturers in a way that they understand, they will understand.


context

I don't think you understand. Anyone in the world can do this. Right now. Any time.

context

I bet that if someone who knew what they were doing decided to optimize that, you'd get the cost WAY down, possibly almost to zero. (If you are using std::string, that is your problem right there).

But the very important difference here is that in your case you have a choice, and it is possible to optimize the cost away and to otherwise control the characteristics of when you pay this cost. In GC systems it is never possible to do this completely. You can only sort of kind of try to prevent GC. It's not just a difference in magnitude, it's a categorical difference.
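
(To illustrate the kind of rewrite I mean, a hypothetical example: replace the heap-allocating std::string construction with a stack buffer whose cost you fully control.)

    #include <cstdio>

    void print_score(int score) {
        char buffer[64];  // no heap traffic, no hidden allocator behavior
        snprintf(buffer, sizeof(buffer), "Score: %d", score);
        fputs(buffer, stdout);
    }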


context

With capacitors you can also get the charge into the cells a lot faster, possibly. Also there are issues of energy density, overall weight, etc.

context

I recommend against using SSAO. It makes games look bad (and kind of cheap), because the results do not look much like the way light really behaves.

If you want your rendering to have nice lighting, there are all kinds of preprocessed global illumination techniques you can use. Yeah, these are not great for animated objects, and SSAO people will tell you SSAO is good for animated objects ... except it isn't, because again, the result does not look anything like actual light and shadow (I don't consider speed to be a great virtue if the thing that you produced quickly is not very good... Unless there is no way to do anything better because you can't afford it.) There are other techniques you can apply in the case of animated objects to produce much better output (baking a per-vertex occlusion representation and evaluating it on the GPU, for example.)

There was a brief window in time when SSAO maybe seemed like a good idea but we are well past that.

The reason I say SSAO makes games look "kind of cheap" is because it usually gets used by Unity games that just turn on the SSAO flag. These games are instantly recognizable.


context

They have been hiring way too fast for way too long. The beginning of the end is years ago, at least.

context

I drive an expensive car. I've also put 40x the cost of the car into a company in order to pay people their salaries. The expensive car barely registers as noise on this budget. If I had to lay people off, that would still be true.

context

As recently as four years ago, the rhetoric was "there's no infrastructure for electric cars, building one is totally infeasible, therefore electric cars will never work."

Now the rhetoric is "waaah, the charging stations are kind of inconvenient."

Anyone who writes this kind of article without looking at dsituation/dt is being dumb.


context

Graph does not appear to be inflation-adjusted.

context

From a newly-created account, no less... sigh.

context

Did you miss the fact that they are not only a car company any more?

context

Yes, it's a tough life, but people actually pick that life because other options are tougher (for example being a rural farmer somewhere).

It is true that we should care about conditions in factories such as these and work to improve them. It's also true that if you insisted that Chinese workers get paid what American workers would, there would have been almost no factories and the entire country would still be in desolate poverty. Yes if there were no factories then probably the environment would be in a better condition.

It's a very complex situation. Just picking one angle to it and insisting that angle holds the whole truth does not help anyone.


context

If you ever run a company you'll know that job creators exist.

I had a bunch of money. I could have kept it locked in a box somewhere. Instead I decided to pay a bunch of people to help make something cool. Yes, that's a transaction, but for my part, I didn't have to engage in it. (In fact some days I kind of regret it, given all the garbage one has to put up with when running a company.) I could have kept the money locked in a box and felt happy that I feel rich. Or something.


context

And this leads to code that is either too complex, too inefficient, or both.

I mean, it's fine if you are competing with Python or something, but if you actually care about perf this is never going to work. But of course anyone wanting to coin a "Rule of X" doesn't want to see a problem in its full context; they just want to be able to pretend to have a solution well enough that they can at least fool themselves. Etc, etc.


context

All AI articles are equally vacuous. I have not bothered to read them for at least 10 years now.

context

This has actually been in Microsoft's C compiler for a long time, too. (The one that is mostly only C89). It is an embarrassing situation, really.

context

It is a reasonable point that organization is important but I have to disagree about "most important".

The most important skill in software development, by far, is managing your own psychology.


context

Any reply like this, that does not consider the differences between recorded music and photography of human subjects, is neither thoughtful nor genuine.

There is some kind of a point here, but it's ruined by the author's lack of perspective and desire just to land a gotcha.


context

I disagree completely. I am not a physicist, but I make video games, which have the same kind of constant grounding in applicability. (We are always dealing with running physical systems, it's just that they are simulated.)

The week I learned to treat vectors as abstract objects, rather than arrays of coordinates, I experienced a drastic phase shift in my ability to program geometric operations effectively and clearly. The coordinates are still there, of course, but you have a lot more power over them.

The book "Linear Algebra Done Right" is all about this, and I absolutely recommend reading it if you haven't.


context

From my perspective, a valuable service that Uber provides me is to get rid of the hassle and annoyance of tipping.

Of course, for this to be done well they need to pay the drivers enough that tipping is not necessary.


context

FizzBuzz is not about the modulo operator. You can write it pretty easily without it, in several different ways. If you can't figure out how pretty quickly, you're not a good programmer. Sorry.

It is even kind of reasonable to just substitute a call to some black-box is_divisible_by() if you can't figure out how to do that test ...
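
(One of those several ways, as a minimal sketch: keep small counters instead of dividing.)

    #include <cstdio>

    int main() {
        int three = 0, five = 0;
        for (int i = 1; i <= 100; i++) {
            three++; five++;
            bool fizz = (three == 3);
            bool buzz = (five == 5);
            if (fizz) three = 0;
            if (buzz) five = 0;
            if (fizz && buzz) printf("FizzBuzz\n");
            else if (fizz)    printf("Fizz\n");
            else if (buzz)    printf("Buzz\n");
            else              printf("%d\n", i);
        }
        return 0;
    }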


context

In many performance-programming situations you are subject to constraints that prevent use of a general solution or else makes use of a general solution massively inefficient.

For example, your data structure is on the GPU and your data is in a texture in a certain specific format because of other reasons.

If you wrote the above reply without considering this kind of case, it probably means you haven't been exposed to very much of this kind of case ... ... which was my original point.


context

I have no idea what the job was.

But my point is this is not a bureaucratic gotcha question. If you can't do this task, you don't really know how to program well. Sorry but that's just how it is. It's like failing FizzBuzz.

There is this culture of crappy software that has happened lately, especially in the Web world, and it is really quite lamentable. I believe that a very large positive impact would be made on the world -- due to the extreme prevalence of software these days -- if more people would take seriously the idea of software creation as a craft with a very high skill ceiling, and work diligently to improve their understanding and their skills.


context

Maybe you're just not doing serious programming. Most people I know implement data structure searches quite often.

If you're writing scripts, or JS code for web pages or something like that, then maybe you don't use CS stuff, but ... are you able to write a web browser if you had to? Are you able to write an operating system or navigational software for a spacecraft? If not, then maybe just see this as revealing sectors of your skill set that could be beefed up, rather than presuming that none of that stuff is important.


context

I am dismayed by the way all the reactions on Twitter are piling on with outrage and/or relating similar experiences.

Inverting a binary tree is pretty easy. It is not quite as trivial as FizzBuzz, but it is something any programmer should be able to do. If you can't do it, you probably don't understand recursion, which is a very basic programming concept.

This isn't one of those much-maligned trick interview questions. This is exactly the kind of problem one may have to solve when writing real software, and though you may never have to do this specific thing, it is very related to a lot of other things you might do.

I run a small software company and I very likely would not hire a programmer who was not able to step through this problem and express a pseudocode solution on a whiteboard.
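
(For reference, the recursive version really is just a few lines; a sketch with invented types:)

    struct Node {
        Node* left;
        Node* right;
    };

    void invert(Node* node) {
        if (!node) return;
        Node* tmp   = node->left;   // swap the children...
        node->left  = node->right;
        node->right = tmp;
        invert(node->left);         // ...then recurse into each side
        invert(node->right);
    }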


context

I flagged your comment and downvoted you. If a Wall Street worker feels unduly stressed, that person can easily quit and work a nicer job; it is only greed holding them there. A poor person working multiple part-time jobs probably hasn't much of a choice. So whereas sure, we can empathize with everyone, the problems of one group are much less real than the problems of the other.

context

This is not at all true. For example, in new age communities, "triggered" is used very frequently to mean that someone's personal ego wires have been tripped -- and in fact the reason the word is used is because it's understood that "being triggered" is not that big a deal, it's part of being human, and one may get very upset in the moment but with perspective one sees it's fine.

This is obviously very different from the PTSD-only meaning you are talking about, but I would bet that the new age version of the word has been in use for much longer and by many more people.

So ... I sense some presumption and lack of exposure here, is all I am saying.


context

As someone who runs a small company, I agree with this 100%. You don't want to be that guy.

context

This is a waste of time. I am out of here.

context

(And really my point is that I perceive there is an ambient pressure toward copying function parameters in general in order to minimize refactoring ... which is what I mean by there being an overall performance impact).

context

Well, keeping in mind that I don't know much of the specifics of Rust, and am just making a guess at what it's like to use, this is what I mean:

Actually predicting where data is really going to go involves solving the halting problem. So by necessity any static analysis of ownership is going to be conservative, in the sense that it has to err on the side of safety.

So there's a process of structuring things so that it's not just the programmer who understands, but the compiler who understands. Structuring the code in alternative ways so that ownership is made clear and/or ambiguous cases are resolved. Sometimes this could be a small amount of work, but sometimes it could be a very large amount of work (analogous to the simpler situation in C++ where you are using const everywhere but need to change something deep down in the call tree and now either everyone has to lose their consts or you have to do unsavory things).
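
(To spell out that C++ analogy with a tiny invented example:)

    struct Mesh { int lod_bias; };

    // const has been propagated all the way down the call tree...
    void prepare(const Mesh& m) {
        // ...and now, deep inside, it turns out we need to write one field.
        // The choices: remove const from prepare() and every caller above
        // it, or do the unsavory thing:
        const_cast<Mesh&>(m).lod_bias += 1;
    }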

At points, it might be possible to structure things so that the compiler would understand them and let you do it, but it would take a large amount of refactoring that one doesn't want to do right now (especially if one has had a few experiences of starting that refactor and having it fail), so instead one might punt and just say "this parameter is owned, problem solved". And that's great, you can stop refactoring, but you just took a performance hit.

Now, in some cases it is probably the case that this is in reality an ambiguous and dangerous ownership situation and the language just did you a favor. But there are also going to be cases where it's not a favor, it's just the understanding of ownership being conservative (because it has to be), and therefore diverging from reality. But I want to get work done today so I make the parameter owned, now the compiler is happy, but there is a performance cost there. If I were not so eager on getting work done today, I might be able to avoid this performance hit by wrestling with the factoring. But I might deem the cost of that prohibitive.

That's all I mean. But like I said, I have never written a large program in Rust so I am not speaking from experience.


context

I see. Having to say .into() makes me feel a little better about it. But it does make it clear there is a runtime performance cost to insisting on a strict ownership model.

context

Yeah, and this is one of the many reasons why performance programmers usually do not touch std::string.

context

I am talking about the actual conversion of an unowned object to an owned object, which as nearly as I can tell involves copying the object? Implicitly? All the time?

context

This way of doing things sure sounds like it has massive performance implications.

context

Going further, why doesn't your shell show you while you're typing? I have seen one or two application programs that do this and it is VERY helpful.

context

You could do that, but then you aren't storing events any more, you're storing world state again. (That is what it means to make events reversible ... or at least that is the straightforward way I would think to do it.)

context

In most worlds, the oldest state just gets thrown out; you keep playing as usual, it's just that if you were then to try and rewind all the way to the beginning, there would be a limit on how far you can go.

In world 4, though, where you can walk to the beginning of time just by walking to the leftmost part of the level, I actually kick the player out of the level when there's no more memory. I have never heard of anyone noticing this.


context

There is a system of full-frames and deltas, like video encoding. Every frame gets saved and reproduced exactly. There is no interpolation.

context

Exactly: I record the positions of all objects every frame (and other state variables), not the inputs that generated them.

It is more expensive in terms of the amount of memory required, but it is much less expensive in terms of the amount of CPU required, and CPU was ultimately the biggest problem, so it seems I made the right decisions. Even on a limited-memory console like the Xbox 360 you can rewind most levels for 30-45 minutes before running out of buffer. That is more than anyone ever wants to do as a practical gameplay interaction.
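
(A toy sketch of the idea, not the game's actual code: snapshot the state variables every frame and pop snapshots to rewind. A real system delta-compresses against periodic full frames, as described above, and throws out the oldest data when the buffer fills.)

    #include <vector>

    struct ObjectState { float x, y, vx, vy; };      // per-object state variables
    using FrameSnapshot = std::vector<ObjectState>;  // all objects, one frame

    struct RewindBuffer {
        std::vector<FrameSnapshot> frames;

        void record(const FrameSnapshot& s) { frames.push_back(s); }

        bool rewind(FrameSnapshot* out) {
            if (frames.empty()) return false;  // hit the start of recorded time
            *out = frames.back();              // restore the most recent frame...
            frames.pop_back();                 // ...and consume it
            return true;
        }
    };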

Working on The Witness... it will be done ... someday not too long from now.


context

In most real physics systems, you have to solve systems of simultaneous constraints in order to get the right answer, i.e. all the guys involved in an interaction are sort of going at the same time. (Though also these systems do tend to have ways to break constraints and subdivide the problem when it gets too big, which then does give preferential movement, but not in a way that any kind of functional programming would solve, because it is inherent in reducing the complexity of the math problem.)

There may be exceptions to this; it is not a subject I keep current with.


context

Indeed, this is a relatively trivial problem. There are reasons you might want to do something like this (like if you are trying to parallelize your sim) but I'd slot that under "open research problem that nobody agrees on".

There are a lot of problems with global state in games, but entity values are not one of them, mostly. Problems do crop up but you just deal when they do.


context

Not true, mostly. I recorded world state, not events. The one exception is world 5 where I record events so that your shadow-universe guy can do things.

Event recording has a fair bit of history in games, especially as a debugging technique, but I did not want to use it for rewind, considering it too fragile and annoying, and probably too expensive and complicated (you would have had to store world state anyway, to have something nearby to delta from so that you don't start from the beginning of time every frame, so now you have TWO systems: world state recording and event recording. Better to stick with one.)


context

That is not remotely the whole argument. The argument is really about the fact that competition ultimately drives profit margin to zero. See his other writings for details.

context

The problems described are accurate. From a modern standpoint, C and C++ are terrible. The only problem is that most new languages are more terrible along other axes, making them unusable for what C and C++ get used for.

The author is accurate that Rust may be a substantial improvement.

I don't necessarily want to program in Rust, which is why I am designing my own alternative, but I think if you are building safety-critical software then something Rust-like is a pretty good idea.

For my own experiments I am going more in the direction of giving the programmer auditing power, rather than just having the programmer have to silently worry about what code may or may not be doing.


context

This article seems goofy and weird. He spends a LOT of time randomly talking, in order to justify not using a profiler, when profiling is such a simple and easy thing.

I know many high-performance programmers and all of them profile because profiling is how you test your mental model against reality. Yes, as the author says, having a mental model of machine performance is important. But you need to test that against reality or you are guaranteed to be surprised in a big way, eventually.

Example: How does he even know that his div optimization matters? If he is even reading through one pointer in that time, he is probably taking a cache miss on that read, the latency of which is going to completely hide an integer divide. The author seems generally to not understand this, since he spends most of his time talking about instruction counts. Performance on modern processors is mostly determined by memory patterns, and you can have all kinds of extra instructions in there and they mostly don't matter.

Which this guy would know if he profiled his code.
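
(And profiling really is cheap to start with. A minimal sketch; the two loop functions in the usage note are invented stand-ins:)

    #include <chrono>

    template <typename F>
    double time_ms(F&& f) {
        auto t0 = std::chrono::high_resolution_clock::now();
        f();
        auto t1 = std::chrono::high_resolution_clock::now();
        return std::chrono::duration<double, std::milli>(t1 - t0).count();
    }

    // Usage:
    //   double with_div    = time_ms([] { loop_with_divide(); });
    //   double without_div = time_ms([] { loop_without_divide(); });
    // If the two numbers come out the same, the divide was hiding behind
    // memory latency all along.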


context

Yeah, umm, especially in high-end games, the idea of every object "knowing how to draw itself", in the stereotypical OO way, is completely absurd.

context

"This solution isn't perfect therefore we shouldn't do it"?

context

Because now your ability to install your mission-critical software is dependent upon https://whatever.io actually being up. Which it certainly won't be forever.

Or, you know, maybe someone updated the whatever.io installer to make it 'better'. But you are trying to debug some problem and you made one image last month and another one this month and you're pulling your hair out trying to figure out why they are different. Oh, it's because some text changed on some web site somewhere.

You've taken a mandatory step and put it outside your sphere of control.


context

The article is not good.

It is true that games tend to be overly simplistic in this way, but it is really a design problem, not a technical problem.

Technically, we can litter the ground with stuff and have plenty of permanent-world changes. Design-wise, it is usually unclear how to make that a playable game. Using someone's middleware is not going to help this.

The premise of this article is another example of what Frank Lantz calls the Immersive Fallacy: https://www.youtube.com/watch?v=6JzNt1bSk_U


context

It's about how much friction is imposed on what. As anyone designing a "funnel" for web purchasing (or whatever) will tell you, a little friction goes a LONG way.

Yes, of course you can go out of your way to get connected to the people making a game. But it's just harder to do that on iOS than on Windows, and this has consequences in terms of the viability of these platforms for small developers. (It is by no means the only factor. The race-to-zero pricing on iOS is probably a bigger factor.)


context

It's not about giving your email address to everyone (I wouldn't want that either), but it's about having the friction-free option of doing so. There are a number of game-makers for whom I absolutely want to be emailed when they make something new, because I love their stuff. That doesn't happen on iOS very easily.

context

You are saying "anti-marketing" but what this really means is that they are putting less investment into production values than you prefer.

The whole point is that it is doubtful that they will stay solvent if they put more money into production values on that platform.

Nice graphics are very expensive. (They are more expensive than any other aspect of game development, in fact). If it seems unlikely that enough people will buy their game given that type of investment, then it may be a good choice to stay away from that.

On top of which, maybe they just don't want to spend all their time doing graphics. Maybe they want to work on the story / world / etc.


context

There are massive differences. For example, in the Windows world you are allowed to find out who likes your stuff and build relationships with them.

context

Why is (2) dependent on anything regarding the number of samples you get at once? Sure, suppose there is a maximum block size; why does anything regarding copying "an entire something" require you to have filled that entire block size with live data? Why can't you just copy however much is available in the buffer?

I don't understand why copies are even relevant: you can make several extra copies and nobody will ever notice. Audio data is trivial in modern systems. Let's say there are two channels coming in; at 48kHz with 16-bit samples, that is 48000 * 2 * 2 = 192,000 bytes per second, an absolutely trivial amount of data to copy, and it has been for many years. Building some convoluted (and unreliable) system just to prevent one copy per application, when each application is going to be doing a lot of nontrivial processing on that data, strikes me as foolish. But don't listen to me, look at the fact that Linux audio is still famously unreliable. If the way it's done were a good idea, it would actually work and everyone would be happy with it.


context

Well, it depends on how that specific hardware is designed, but we could say that hardware that is designed to generate only fixed blocks of audio is very poor from a latency perspective.

I think you will find, though, that most hardware isn't this way, and to the extent this problem exists, it is usually an API or driver model problem.

If you're talking about a sound card for a PC, probably it is filling a ring buffer, and it is the operating system's (or the application's) job to DMA the samples out before the ring buffer fills up; how many samples you get depends on when you do the transfer. But the hardware side of things is not something I know much about.

> If you are writing some sort of synth, as soon as you receive a midi note or a tap, trigger the synth and the note will play in the next audio block

Yeah, and waiting for "the next audio block" to start is additional latency that you shouldn't have to suffer.

> If you are doing some sort of effect, grab the input data, process and have it ready for the next block out. I don't understand why you need a second loop.

The block of audio data you are postulating is the result of one of the loops: the loop in the audio driver that fills the block and then issues the block to user level when the block is full. My whole point is you almost never want to do it that way.


context

I am not sure we can make the assumption that the input and output devices are on the same clocks or run at the same rates. Maybe they are (in a good system you'd hope they would be), but I can think of a lot of cases where that wouldn't be true.

However, even when they are synced, you can still easily see the problem. The software is never going to be able to do its job in zero time, so we always take a delay of at least one buffer-size in the software. If the software is good and amazing (and does not use a garbage collector, for example) we will take only one delay between input and output. So our latency is directly proportional to the buffer size: smaller buffer, less latency. (That delay is actually at least 3x the duration represented by the buffer size, because you have to fill the input buffer, take your 1-buffer's-worth-of-time delay in the software, then fill the output buffer).

So in this specific case you might tend toward an architecture where samples get pushed to the software and the software just acts as an event handler for the samples. That's fine, except if the software also needs to do graphics or complex simulation, that event-handler model falls apart really quickly and it is just better to do it the other way. (If you are not doing complex simulation, maybe your audio happens in one thread and the main program that is doing rendering, etc just pokes occasional control values into that thread as the user presses keys. If you are doing complex simulation like a game, VR, etc, then whatever is producing your audio has to have a much more thorough conversation with the state held by the main thread.)

If you want to tend toward a buffered-chunk-of-samples-architecture, for some particular problem set that may make sense, but it also becomes obvious that you want that size to be very small. Not, for example, 480 samples. (A 10-millisecond buffer in the case discussed above implies at least a 30-millisecond latency).
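
To make that arithmetic concrete, here is a throwaway C++ snippet (48kHz assumed; the buffer sizes are just examples I picked) that computes the best-case in-to-out latency for a few buffer sizes:

    #include <cstdio>

    int main() {
        const double rate = 48000.0;  // samples per second
        const int sizes[] = {64, 128, 256, 480};
        for (int samples : sizes) {
            double buffer_ms = 1000.0 * samples / rate;
            // fill input buffer + one buffer of processing + fill output buffer
            printf("%4d samples: buffer %5.2f ms, best-case latency %5.2f ms\n",
                   samples, buffer_ms, 3.0 * buffer_ms);
        }
        return 0;
    }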


context

I understand that people are downvoting this because it is just a negative comment, or something. But, I felt it was VERY important to call out information that is clearly false. Someone who doesn't know about audio programming might read the above post and think "hey that sounds plausible, I learned something today" when in fact they were deeply misled. Registering dissent is important and I tried not to be rude about it. I did go on to give a sketch of reasons in the thread below (but it is a complex issue with a lot of details; exact situations differ on every platform; etc, etc.)

context

If you want a further analogy, it's like public transit. Which is a better commute: You take Bus A, which then drops you off at the stop for Bus B, at which you have to wait a varying and indeterminate amount of time, because the schedules for Bus A and Bus B are not synchronized; or just taking Bus C, which travels the same route without stopping?

context

I could type up a thorough explanation, but it would take about an hour, and I have a lot to do. It is actually not a bad idea to do such a write-up, but I don't think the appropriate venue for it is an ephemeral post on Hacker News ... I'd rather blog it somewhere that's more suitable for long-term reference.

But I'll drop a few hints. First of all, nobody is talking about running interrupts at 48kHz. That is complete nonsense.

The central problem to solve is that you have two loops running and they need to be coordinated: the hardware is running in a loop generating samples, and the software is running in a (much more complicated) loop consuming samples. The question is how to coordinate the passing of data between these with minimal latency and maximum flexibility.

If you force things to fill fixed-size buffers before letting the software see them (say, 480 samples or whatever), then it is easy to see problems with latency and variance: simply look at a software loop with some ideal fixed frame time T, and look at what happens when T does not match the hardware's 10ms packet period. (Let's say the loop runs at a hard 60Hz, such as on a current game console). See what happens in terms of latency and variance when the hardware is passing you packets every 10ms and you are asking for them every 16.7ms.
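
If you would rather not do the algebra, a few lines of C++ will show the jitter. This is just a toy simulation of the scenario above (10ms producer, 60Hz consumer), not any real API:

    #include <cstdio>

    int main() {
        const double packet_ms = 10.0;           // hardware pushes a packet every 10ms
        const double frame_ms  = 1000.0 / 60.0;  // software asks every 16.7ms
        for (int frame = 1; frame <= 12; frame++) {
            double now = frame * frame_ms;
            int packets = (int)(now / packet_ms);  // packets completed so far
            double staleness = now - packets * packet_ms;
            printf("frame %2d at %6.1f ms: freshest available sample is %4.1f ms old\n",
                   frame, now, staleness);
        }
        return 0;
    }

The staleness wobbles between roughly 0 and 10ms from frame to frame, which is exactly the latency variance being described.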

The key is to remove one of these fixed frequencies so that you don't have this problem. Since the one coming from the hardware is completely fictitious, that is the one to remove. Instead of pushing data to the software every 10ms, you let the software pull data at whatever rate it is ready to handle that data, thus giving you a system with only one coarse-grained component, which minimizes latency.

You are not running interrupts at 48kHz or ten billion terahertz, you are running them exactly when the application needs them, which in this case is 16.7ms (but might be 8.3ms or 10ms or a variable frame rate).

You don't have to recompute any of the filters in your front-end software based on changing amounts of data coming in from the driver. The very suggestion is nonsense; if you are doing that, it is a clear sign that your audio processing is terrible because there is a dependency between chunk size and output data. It should be obvious that your output should be a function of the input waveform only. To achieve this, you just save up old samples after you have played them, and run your filter over those plus the new samples. None of this has anything to do with what comes in from the driver when and how big.
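
Here is a sketch of that "save up old samples" idea, as a toy C++ FIR filter (made-up names, not any particular API). Its output is a function of the input waveform only; split the same stream into chunks of any sizes and you get bit-identical results:

    #include <cstddef>
    #include <vector>

    struct FirFilter {
        std::vector<float> taps;     // filter coefficients
        std::vector<float> history;  // the last taps.size()-1 input samples

        explicit FirFilter(std::vector<float> t)
            : taps(std::move(t)), history(taps.size() - 1, 0.0f) {}

        // Process a chunk of any size. Because the filter window runs over
        // history + new samples, the output never depends on the chunking.
        void process(const float *in, float *out, size_t n) {
            std::vector<float> buf;
            buf.reserve(history.size() + n);
            buf.insert(buf.end(), history.begin(), history.end());
            buf.insert(buf.end(), in, in + n);

            for (size_t i = 0; i < n; i++) {
                float acc = 0.0f;
                for (size_t k = 0; k < taps.size(); k++)
                    acc += taps[k] * buf[i + taps.size() - 1 - k];
                out[i] = acc;
            }
            // Keep the tail around for the next chunk.
            history.assign(buf.end() - (taps.size() - 1), buf.end());
        }
    };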

Edit: I should point out, by the way, that this extends to purely software-interface issues. Any audio issue where the paradigm is "give the API a callback and it will get called once in a while with samples" is terrible for multiple reasons, at least one of which is explained above. I talked to the SDL guys about this and to their credit they saw the problem immediately and SDL2 now has an application-pull way to get samples (I don't know how well it is supported on various platforms, or whether it is just a wrapper over the thread thing though, which would be Not Very Good.)


context

As someone who has actually done a good amount of soft-real-time audio programming, I can tell that you probably haven't. Everything you are saying about CPU speeds is made-up nonsense. Look into how these things are done on systems where folks actually care about latency (for example, commercial audio hardware, game consoles, etc).

context

But ... why is there a period size? Isn't that a broken design that can only introduce latency? What is wrong with "however much audio data is ready when the application asks, send it"?

context

If you actually read the words in the article you will see that that test was the absolute best case they could find.

context

This does not read to me like good advice from an experienced programmer.

context

Nope. Nope nope. This is the same argument that early OO people used to justify the idea that you should use getter/setters everywhere rather than directly accessing your variables. You never know if the implementations of those ideas will change!!!!!11

That's programming. It is always possible that anything might need to change. That is how it is. This does not justify calcifying your code by adding extra unnecessary structure, because what that in fact does is make the program harder to change later (while requiring you to do more work up front). Also, as the author of the article notes, it requires one to keep more pieces of information clear in one's head in order to work with code of equivalent complexity, something that is almost always a big lose.

In a good language, if a dependency implementation changes, you know this because your program does not compile. (Well, of course because you are not a noob, you are linking things that are versioned in the first place, so this should not ever even be an issue unless you are actively upgrading outside code and are expecting it.) When your program does not compile, you want the compile error to be at the site that uses the dependency, because that tells you exactly where the thing is that you need to fix. Adding excess verbiage around it, and distancing the site that instantiates the dependency from the site that uses it, only causes more work.
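
To make that concrete, here is a toy C++ contrast (all names invented). Both versions do the same thing; one of them just has more structure between the use site and the thing being used:

    #include <cstdio>

    // Direct and clear: the site that uses the dependency also names it, so
    // if Logger's interface changes, the compile error lands right here.
    struct Logger {
        void log(const char *msg) { printf("%s\n", msg); }
    };

    struct GameDirect {
        Logger logger;
        void run() { logger.log("frame"); }
    };

    // "Injected": same behavior, but now there is an extra interface to
    // maintain, and construction happens somewhere far from the use site.
    struct ILogger {
        virtual void log(const char *msg) = 0;
        virtual ~ILogger() {}
    };

    struct StdoutLogger : ILogger {
        void log(const char *msg) override { printf("%s\n", msg); }
    };

    struct GameInjected {
        ILogger *logger;  // wired up elsewhere, by a container or factory
        explicit GameInjected(ILogger *l) : logger(l) {}
        void run() { logger->log("frame"); }
    };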

If you are using a language/system that doesn't allow you to program this directly and clearly, then maybe that is the problem...


context

Yeah. I only heard about "dependency injection" a few months ago, and my reaction was that my brain just didn't get it, because, like, why are you making a huge deal about such a simple thing? If we made this much of a big deal out of every idea in programming, we would never be able to get anything done.

Since then I keep hearing about "Dependency Injection" so my impression is that it's gaining in popularity. But my kneejerk reaction is always that if someone is talking about this subject, they probably are not a very good programmer, just like if someone is talking about how important UML diagrams are. It is maybe a hasty conclusion but that is where my brain goes.


context

Before posting a question like this, please bother to look into the logistics involving the company (for example, their actual plans for scaling). They are a new car company. They do not yet have the capacity to manufacture at high scale, but they are working on it. Whether or not they hit their plans, 40k cars/year is well beyond their stated expectations for the year 2015.

I mean, are you supposed to be able to just wake up in the morning and say "hi, I am a car company" and manufacture 50k cars?


context

Yes, everyone moved every frame. If I remember correctly, they would get linearly extrapolated until the next time the update for that guy ran. I think there was also an option to do the terrain interaction every frame, to avoid any perception of guys sinking into the terrain slightly.

context

No, it's not really different from a hard-sphere type interaction. It's the same thing. But that doesn't matter in this context, because in the system we built, our problem was harder/more-general because we were simulating 'guys', meaning you can program an arbitrary interaction into your guys (is guy type X near guy type Y, if so what happens?), and those 'guys' also interacted with a 2D height field terrain.

Keep in mind this was all on computers from the year 2002, which were pretty damn slow compared to computers today. Today you could do a lot of guys.


context

No offense, but you are living in a kind of unfortunate corner of "industry".

Where I am sitting, you absolutely have 10 years (or more) to give toward a specific cause, and the path of the software developer is one of lifelong improvement.

Stuff like "the rate at which tools are changing" doesn't matter too much, because that stuff is just surface-level knowledge, not deep knowledge.

I am 43, and have much to do yet; if you are telling me I am due for retirement, I suggest you have a very warped view of the world.


context

Not really. Back in 2002 we did the first game jam where the engine concept was "100,000 guys". That was 13 years ago:

http://www.indiegamejam.com/igj0/

Those guys all interacted with each other and the environment (though the engine was designed to do the interactions in slices, where 1/N of the guys would be checked each frame, but N was not high, like 4 or 5 maybe?)


context

And Sony's new VR headset runs at 120Hz. To anyone who thinks 10ms is not noticeable, I encourage you to do the math: at 120Hz, a frame lasts about 8.3ms, so 10ms is longer than an entire frame.

context

San Francisco, both AT&T and Verizon. It's terrible here.

context

I don't even get 4G speeds almost ever. Bandwidth is always massively oversold. Often I can't even successfully load a random web page when I have 4+ bars of "4G".

context

I am not sure I believe this. What studios have you worked for that found lack of source okay?

Basically anyone making interesting games needs source, because at some point you want to do things that the engine didn't exactly anticipate.

Also, if Unity really were "an efficiency avalanche", we'd be seeing a lot of high-polish games in Unity. To date we haven't really.


context

Proposal made without any thought as to how scummy app developers are going to game the system.

First thing that happens is opening a new development account per app in an attempt to squeak under the $100k as much as possible.


context

I am not sure why Linux people tend to think this is bad. It is how you ship robust software that doesn't break. Linux "solves" this issue by having all kinds of things break all the time and just accepting that breakage and saying "no really things usually work fine".

context

Kernels tend to want to be very specific about how memory is being used. GCs cause the opposite situation.

i.e. these "specific requirements and limitations" you mention have a LOT to do with memory.


context

Look, what you are saying just doesn't work. What happens when the pointers are in registers? What happens when the loop is occurring in a thread running on another core?

Yes, you can make GC work in these situations, but you are going to pay for it. In perf.

I have to say frankly I do not believe any of the words in your last paragraph at all.


context

"any language should be able to generate the exact same optimized assembly as a manual C++ vector iteration"

This is absolutely, massively untrue. If you ever try writing a compiler, you will see how very hard it is for compilers to be sure about anything.

For example: Are you calling a function anywhere inside that iteration? Is this a copying collector? Could anything that function does (or anyone it calls) possibly cause a GC, or cause us to be confused enough that we can't tell whether a GC might happen or not? Then you need read barriers on all your operations in this function, i.e. your iteration is going to be slow.

"Also, if your data to work on is being streamed in, having to make the choices in managing the allocations & uses of std::vector buffers is much less useful than having the system heuristically balance in a more managed environment."

Also absolutely, massively untrue. Your application knows more about its use cases than the generic systems it is built on (which must handle many many different kinds of programs). Because your application knows what is meant to happen, it can make much better performance decisions.


context

I have great respect for anyone who undergoes a big project. So I am sad that this project is coming to an end, and I hope his future endeavors go well.

But.

From my perspective as someone who keeps going back to Linux and trying to use it every 18 months or so, the #1 problem today is that there are WAY too many distros -- and as a result, all of them are broken. What really needs to happen is for the Linux community to put a great deal of elbow grease into a small number of distros.

Because I only try Linux every year or two (and give up on it every time), I see isolated snapshots of how usable the OS is, and from my perspective, it's gotten less stable and less usable over the past 5 years. (Six months ago I had to try 4 different distros before one would even install correctly on one of my two test laptops, for example).

In terms of mainstream distros that are actively trying to appeal to end-users (not counting fringe research projects), how many is enough to provide good variety? I am thinking 3-5 maybe?

Instead, this is the situation: http://en.wikipedia.org/wiki/List_of_Linux_distributions

Does anyone think that is an efficient way to produce quality results?

Edit: It's also worth keeping in mind that the Wikipedia list is sort of the minimal list of versions. For example, if you go to the Linux Mint homepage, you get 4 different versions to choose from: http://www.linuxmint.com/


context

My point is that you can solve this one symptom but your program will have many other problems due to GC (provided it does a lot of work). It is like whack-a-mole in that there are always more moles.

context

It is only an implementation issue if it is possible to solve the issue.

Nobody has ever built a garbage collector that does not slow your program down or cause it to use vastly more resources than it would otherwise. (Claims to the contrary are always implicitly caveated).

Given that this is the case, it really does start looking like a language issue. Yes you can rearchitect the GC to care more about locality, but you are just pushing the dust around on the floor: you will find a different problem.


context

Show me your commits where you've fixed lots of driver bugs. You sound like you have no idea what you're talking about.

context

Which is why people who are serious about memory write their own allocators (or link preferred allocators with known behavior). It is an extremely common thing.
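
For anyone who hasn't seen one, here is a minimal arena ("linear") allocator sketch in C++ of the kind being described. Illustrative only, not production code: one big block up front, pointer-bump allocation, free everything at once.

    #include <cstddef>
    #include <cstdint>
    #include <cstdlib>

    struct Arena {
        uint8_t *base;
        size_t   size;
        size_t   used;

        explicit Arena(size_t bytes)
            : base((uint8_t *)malloc(bytes)), size(bytes), used(0) {}
        ~Arena() { free(base); }

        // align must be a power of two.
        void *alloc(size_t bytes, size_t align = 16) {
            size_t p = (used + align - 1) & ~(align - 1);  // align the cursor
            if (p + bytes > size) return nullptr;           // out of space
            used = p + bytes;
            return base + p;
        }

        void reset() { used = 0; }  // e.g. once per frame
    };

Known behavior, no hidden bookkeeping, and allocations that are used together end up next to each other in memory.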

context

It's bad.

The reason is that you don't control the GC and don't even necessarily know what exactly drives the decisions it makes. So once you want to go beyond a certain level of performance, there is no right answer. You are just randomly trying stuff and kind of flailing.

In C++ (or another direct-memory language), there is a right answer. You can always make the memory do exactly what you want it to, and there's always a clear path to get there from wherever you are.


context

No, no it isn't.

There is always some garbage velocity beyond which any given system is not able to cope.

Usually that limit is kinda small compared to what you'd actually like your program to be able to do.


context

When I think of what I want in an operating system, "I wish it would lock up occasionally while everything garbage collects" is not high on the list.

context

You can't schedule something unknown.

context

Steam doesn't deal with its own files. It deals mainly with random games that are creating tons of random files.

context

See this graph:

http://www.extremetech.com/wp-content/uploads/2013/08/CPU-Sc...

The green line was the only one still going, and it plateaued about a year ago (you'd see that in a newer graph).


context

Math is supposed to be about pursuit of truth, not how congenial you are.

context

Yeah. Requirement #1 for any test like this is, it happens in a Faraday cage.

context

Do you think they hire boring people to be reps? If you were in charge of hiring the most effective people for this job, who would you hire?

context

Like I said, that is a problem. So start thinking about how to solve it ....

Like, the plane is always landed from the ground, not the air. That idea introduces new problems, but you can start thinking about those too. Come on, people.


context

You can make a similar list about anything that doesn't exist yet. If it were easy, it probably would have been done.

Eight years ago, you could have made an equivalent list about electric cars. Well, we have electric cars now and that situation is looking pretty good.

Imagine you are an Elon-Musk-alike who wants to make fast air travel happen. Then this article isn't a list of why it's impossible, it is a list of problems you need to solve in order to make it work. I think we have enough examples in recent years to show that if someone with sufficient inventiveness attacks the problem hard, many of these kinds of things really are solvable.


context

If the new tools and techniques were actually good, they would be enough for a while, and people wouldn't feel the need for even newer tools and techniques the next week.

This should be an obvious and automatic hype-temperer, but for some reason it isn't.


context

Author has no concept of what it is like to take submissions to anything. As someone who is a member of an investment fund that gets way less attention than a top-tier VC, I'll tell you what it is like to have open submissions:

Almost everything that comes through that channel is complete garbage. It has negative utility to read through that stuff, because it makes you tired enough that you might actually miss something good when you do see it, and it predisposes you toward negativity regarding submissions (which is psychologically unhealthy both for your quality of life and your relationships with potential investees.)

Our hit rate from open submissions was 0.25%, that is, it took 400 submissions to get one company in which we would invest. And that company is one that likely would have come to us through a more-closed submissions process.

One of the most valuable things you can have as a business owner (startup or no) is an understanding of context. Know what the situation is like for the people you are dealing with. Know why they do things the way they do. This author has not built that experience/skill. He is only seeing things from his viewpoint as someone who wants money. Guess what, this makes him isomorphic to every other random startup founder in which a VC is not going to invest.

Use common sense: VCs are financially motivated to find good companies to invest in! If they think something will give them an edge, they are going to try it. The fact that they don't take open submissions should tell you something about the dynamics of the system. You should listen and understand what that something is, because that understanding is valuable.


context

Is there an underlying presumption that smart hacker-type people do not have spiritual interests?

I think a lot of smart hacker-type people would disagree with that.


context

Usually when people say stuff like this, they haven't programmed anything as complex and performant as the thing they are criticizing, so the comments can and should be disregarded as noise.

context

Very misleading title. This is not about a GC speed-up; it is about a reduction in pause time. The GC itself is eating as many cycles as before (possibly more, due to the cost of making things incremental), and they are clear about this in the article.

context

Random mom-and-pop folks have been able to buy high-DPI laptops at Best Buy for well over a year. (And higher DPI than discussed in this article; for example, 3200x1800).

In computer lifecycle terms that is a long time and it is a little bit embarrassing that Linux works as poorly as it does in these situations. (Windows ain't so great at it either, which reflects poorly on Microsoft, but it still handles the situation a lot better than Linux does.)


context

It is suspiciously posted by an account that was just created and is even named "throwaway". Downvotes are fully justified.

context

I should clarify before people respond antagonistically ... as someone who is designing a programming language, I think it's great any time someone is making languages and trying out new ideas. That part is great, if you want to make a functional imperative declarative loosely strongly typed language then hey, go for it.

But I think one has a responsibility not to try and sell one's project as something it's not, which especially means being careful with claims. I know this is sometimes hard because a lot of language design stuff comes from the academic community which notoriously overclaims (because it is their job to overclaim), and it is easy for that culture to rub off.

But if you are going to say something like "just by doing this our code becomes 200% better", with a straight face, about something that most practitioners know is going to be terrible in most cases without a tremendous amount of additional work and solving of unsolved problems (solution not shown), you're just telling the reader that they can't take you seriously. It's a bad thing to do.


context

It is cute, but you can't design a language for toy problems or you run into problems with bigger software.

I kind of stopped reading here:

"Since it is declarative code, update returns a new world w2 instead of merely modifying w1. The funny thing is, just by doing this our code becomes 200% better. For example, you can now modify the code to store all world states in a list!"

Uh huh. Try doing that with a nontrivial game that needs to be performant, and let me know how that works out.


context

It seems like it would be pretty easy to filter by amount of motion and list in priority order. (Provided the motion lasts long enough to overcome latency effects, etc).

context

Come on guys, think about it!

It is trivial to make your test routine log the error but return true so that the compiler doesn't stop.
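
Something like this (C++, purely illustrative) is all it takes:

    #include <cstdio>

    // Log the failure, but report success so the build/test pipeline
    // keeps going instead of stopping on the first broken test.
    bool check(bool ok, const char *msg) {
        if (!ok) fprintf(stderr, "TEST FAILED: %s\n", msg);
        return true;
    }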


context

The talk is just over an hour. The second hour is Q&A.

context

It is good that they are testing, but the mindset that this is a special occasion seems very weird to me.

Rule #1 of programming is that if you didn't test it, it doesn't work. (It may still not work for real after you test it, but at least it's got something.)

You can't claim to anyone, or even yourself, that you have some kind of fault-tolerant system if you don't do this kind of test after every change.


context

You never know, you might be talking to someone with actual experience.

context

It is nice to see someone summarizing this kind of information. However, really this is a continuation of the academic attitude toward parsers, which makes them MUCH harder than they have to be.

If you want to study grammars in an abstract sense, then think of them this way, and that's fine. If you want to build a parser for a programming language, don't use any of this stuff. Just write code to parse the language in a straightforward way. You'll get a lot more done and the resulting system will be much nicer for you and your users.
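
As a sketch of what "just write code" means, here is a tiny recursive-descent parser in C++ for arithmetic expressions. It's a made-up toy (no error handling), but parsers for real languages are the same idea with more cases:

    #include <cctype>
    #include <cstdio>
    #include <cstdlib>

    struct Parser {
        const char *p;
        explicit Parser(const char *src) : p(src) {}

        void skip() { while (isspace((unsigned char)*p)) p++; }

        double primary() {
            skip();
            if (*p == '(') {
                p++;
                double v = expression();
                skip();
                if (*p == ')') p++;
                return v;
            }
            return strtod(p, (char **)&p);  // a number
        }
        double term() {
            double v = primary();
            for (;;) {
                skip();
                if      (*p == '*') { p++; v *= primary(); }
                else if (*p == '/') { p++; v /= primary(); }
                else return v;
            }
        }
        double expression() {
            double v = term();
            for (;;) {
                skip();
                if      (*p == '+') { p++; v += term(); }
                else if (*p == '-') { p++; v -= term(); }
                else return v;
            }
        }
    };

    int main() {
        Parser parser("1 + 2 * (3 - 4) / 5");
        printf("%f\n", parser.expression());  // prints 0.600000
        return 0;
    }

Each grammar rule is just a function, and precedence falls out of which function calls which.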


context

Sync is not that hard. I have never had a problem with Dropbox, for example, and that was made by brogrammers.

context

I don't like X. I want to migrate to something simpler, but I think it should be MUCH simpler than anything anyone is building right now.

I actually did a few more tweets after this describing what I think makes sense.


context

The article isn't wrong, it's just ambiguously written. When you are viewing stereoscopic 3D, you are indeed looking at a surface that is flat and at a fixed distance from your eyes. This has consequences in terms of how your eyes are used to working versus how they have to work in a situation like this.

So you might have fun wondering how to build something that doesn't work like that.


context

Sadly the list is missing my favorite math book, "Linear Algebra Done Right" by Sheldon Axler. It's an absolute must if you are learning linear algebra and want a deep understanding of it.

Rather than make you grind through mechanical operations on matrices, as most books do, this book takes a coordinate-free algebraic approach and does an amazing number of things with it, cutting directly to the chief insights. There are ninja proofs in this book, 4 lines long, showing some deep and useful thing about linear algebra that other books would spend pages proving in a very verbose and uninsightful way.


context

Okay, but I never said anything about default, so I don't know what this has to do with anything! As you know, when the government prints more money, it dilutes the value of the existing money. So then, when you buy Treasurys and they get paid back, you made your money by silently leeching it out of the pockets of all other Americans, with the government as intermediary. What a great financial model!

context

I actually do have a 401k (as well as an additional retirement fund), but I only have those things because I have surplus money sitting around. The problem with something like a 401k is that there is a heavy opportunity cost: you don't get to use that money, ever, until you retire. If there were something productive you could have done with the money instead, that were still relatively safe, maybe you should have done that! (Especially in the current climate of seemingly-perpetually-low interest rates).

If creative endeavors are profitable, you can use the resulting money to fuel more creative endeavors, thus making the world a better place. Keeping money in a bank account or publicly-traded stock does not particularly make the world a better place.

Once I got approximately into the f-you money level of income, it became crystal clear how fictitious money is in the first place. I wake up one morning, and bam, I am wealthy! Why? Because someone said so and typed a number into a computer. Okay... that's kind of weird.

Given that money is so fictitious and somewhat meaningless, it is a shame to give into primal hoarding impulses, just so one can see the number in one's bank account go up like a high score in a video game. It's much better to make like Elon Musk and use your money for what it is: a way to wield influence to make the world more like you would like it to be.


context

Those are two very-selective samples. How much did the S&P return from 2000 to 2011? If you look at a chart of an index fund dating back to the 1970s, they certainly look like things that were following an upward trend, which got goosed steeper a couple of times, until everything blew up in 1998 and now there is chaos and unpredictability. 1998-2011 is about 1/3 of the period from 1971 to 2011, so one can't really regard this as a blip!

Yes, money "can" clearly be made investing, if you time the market and are lucky. I am disputing the idea that stocks will always generally go up. I think this used to be true but may have changed. Much like "housing prices always go up", which was shown to be absurd.


context

By the way, when buying TIPS (or any kind of Treasurys), you are basically saying "Yes, I like the fact that the USA is deep in debt and I want it to go more into debt rather than balancing its budget!" That is what Treasury bonds are -- the USA borrowing money from you and promising to pay it back. This is where the debt comes from!

Somehow I do not think many people realize this...


context

I go into more detail on this in my post below, but I think that this is actually not true any more, because the basic nature of the economy has changed. If you adjust for honest estimates of inflation, stocks have been down consistently for 15 years. That starts to look a lot like a "new normal". Then you start adding in survivorship bias, selection bias, etc; the picture is not pretty.

Maybe it's not the new normal, and maybe stock markets will start rising again as they have historically, but when you see a lull this long, it at least suggests the strong possibility of a pattern break.


context

I am kind of shocked by the uniformity of answers here, so I will add a dissenting voice.

In the current economic climate, it is pretty much a waste "investing" in anything until you have, say, an 8-figure sum in cash laying around doing nothing. I don't have that, so I am not bothering with "investing". I put "investing" in quotes because I feel the word tends to be perversely used; people really mean speculation, that is, gambling with negligible effects in terms of real-world wealth creation, but the gambling happens on such a huge scale that it distorts market prices hugely. Real investing is when you put money directly into something in order to enable the creation of something that wouldn't have been possible without your capital (as the YC folks do).

Stocks are terrible. If you look at market histories, corrected for inflation (actual inflation, not government-reported inflation, which is always understated, as the government benefits by understating it -- so normalize against something like an alternative inflation index or else straight-up commodities) then the S&P, DJIA, etc have actually not grown in 15 years. 15 years!! I know all of the "just buy an index fund" seems like good advice -- and it did used to be -- but in modern conditions that is no longer true. On top of this fact, pile on the risk of another market crash due to the USA's still-precarious economic situation, and stocks are clearly just not worth being in. (People are starting to realize this; there have been net outflows from equities most of the time for the past 40 weeks, and insider-selling-to-buying ratios are consistently huge.)

You can put money in bonds, but then it is locked up and you have a lot of inflation risk, so then you'd be aiming at short-term bonds, which are going to yield less.

Really what has happened is that US economic policy has become very hostile toward people who are responsible and save money, as an incidental effect of the desire to stimulate consumption (which mainly means taking on more debt and keeping rates tremendously low because if they ever become not-low now, debt burden is going to crush the economy.)

The upshot is that you are better off taking the mental energy you would have expended on "investing" and subsequently worrying about your money, and instead funneling it into your creative endeavors. You will make more money that way, especially when you take a long-term view. (Think about Einstein and the story about him having a closet of identical suits; except what I am talking about here is way less extreme and way more obvious.)

I have a rant about how peoples' "investing" according to the modern American model is actively making the world a much worse place than it ought to be, but this post is already long enough.


context

“California stop” is already a reserved term that means something else (slowing down mostly, but not all the way, at a stop sign).

context

Try running a company this way and we’ll see how long it lasts.

context

This tradeoff obviously can’t be evaluated without considering quantities. What percentage of your bug load is reduced, at the cost of how much development time?

That said, it’s not that simple. There are threshold effects (more than X amount of time = your company is dead) and intangibles (how do you feel while programming).

But anyone making a blanket statement without considering these things is participating in a dumb internet argument.


context

This article is gross. Very one-sided, no attempt to paint an objective picture. If someone did this about a person, it’d be scummy, but about a company it’s okay?

I am sure the company is messed up, but come on.


context

> (where there is already network delay an order of magnitude larger than the scheduler would impart)

Not the right way to think about it; the connection latency is irrelevant. What is relevant is that you need to play audio in sync with the video, and that audio is coming to you approximately simultaneously with the video it's meant to be synced with.


context

Hashing text is one of the easiest, most common uses of hashing.

As for your first example “you may be relying on a 3rd party service that compares things”, has this ever happened in the history of the universe?


context

Except yes it is, because all these software apps are built on prior “real technology” and would be impossible without it. The flow of capital just doesn’t reflect that (and it’s not clear it should).

context

If you can quickly iterate with a component system, you can iterate even faster just with regular procedures. Component systems aren’t magic (in fact they introduce friction).

context

The idea that you would “quickly hit limits” on a 1-programmer project is completely baseless. In fact a 1-programmer project is easier without an ECS as it’s one less system whose constraints you would have to comply with at all times.

Please stop saying this stuff as it just contributes to the general confusion.


context

It is fine that people are experimenting with putting graphics on the screen and playing with new languages.

But I just wanted to comment that you don't need an "Entity Component System" in a game, and especially not for a very simple not-yet-a-game like shown here. (You also don't need inheritance or composition).

It bothers me that so many people are buying into this hive-mind marketing on ECS, when in reality it is just overengineering + procrastination in almost all cases.

(None of my games have ever had anything as complicated as a component system).

If you want to make a simple game like this, just sit down and program it in the obvious way. It will work. You don't need to be fancy.


context

Also: From a market perspective, this gives agencies such as the ESA a choice about who to work with in order to launch manned flights. That's huge. It's a transition from no competition to competition, and a transition from technological stasis to forward activity. (Soyuz was first launched in the 1960s and we are still using it? It's kind of crazy.)

context

I think you are underappreciating this in a big way.

Prior to this, the USA did not have the capability to transport astronauts to orbit without working with Russia, and if Russia were to just decide no longer to cooperate for arbitrary diplomatic reasons, we'd be screwed. As you know, the shuttle program was halted some time ago and the vehicle was a total cow the whole time it was in service.

So whereas this has been done before on a technical level, this represents a substantial increase in the USA's actual present space capability, and a major step in reversing the decline of the USA's competence in space.


context

Previous discussion:

https://books.google.com/books/about/Bullshit_Jobs.html?id=i...


context

It’s not labor? What is your definition of labor?

context

Was there ever a time when a human being could subsist at any level of decency except through labor?

Why then would you expect humans to be adapted to such a situation?


context

Names for colors or kinds of automobile are also just labels we attach to whole spectra of related things, but the words still have meaning and we are able to use them productively.

context

What is this hypothetical system like, and how is it different from the ones that have been tried? Their proponents described them in ways similar to what you just said, but they have been shown to have the same emptiness problem you attribute to capitalism (as well as some others, like mass murder and economic collapse).

context

It does in this case because it's the same problem in both systems.

He ascribed a problem to capitalism that is also a well-known problem of non-capitalism. The same thing happens in both places (actually worse under communism), so the blame is clearly misplaced.


context

If this is your thesis you’ll have to explain how e.g. Communism totally did not have this problem.

context

Right ... the point is, I can do higher-quality reading without carrying extra devices around.

context

I want a phone that I can read books on, and spend quality time with, rather than reading crappy stuff like Twitter. I fully expect to buy a foldable (not sure which one) and would easily pay $2500+ for it.

(It's not like I can buy laptops any more, since those are all garbage even when I pay $4000 for a supposedly high-end system, so I certainly have spare device money laying around!)


context

Browsers do not treat URLs as secure. If you just go to the page and happen to be live-streaming on Twitch or whatever, anyone can access the document because the information is printed visibly on the screen. This makes it starkly different from a password.

context

> What does this have to do with C++?

Given that most new C++ features in the "modern day" are implemented as std::whatever, in the "everything is a library" way, it's extremely relevant.


context

Why not? How do you know?

context

http://www.cs.virginia.edu/~robins/YouAndYourResearch.html

context

Force multipliers are definitely a thing, but I think that's a different thing. You can have both at the same time.

context

Why? The difference between average programmers and the best ones is actually much higher than 10x.

context

Be very careful from whom you take advice. If someone is not themselves 10x, it's unlikely that what they have to say is accurate or valuable.

The actual content of this article seems to be "we polled people who work on software, and they used positive adjectives to describe people they liked, and negative adjectives to describe people they didn't like". Then it ends with an advertisement for the author.

What any of this has to do with being objectively good at the discipline of programming, I have no idea.


context

Indeed.

If just setting integers is this complicated, what do you expect to happen when you are trying to solve real problems?


context

Many games are like that, but some very popular games aren't.

context

Gee, if only someone would question whether this whole “UB” thing is a good idea.

context

I would be interested in suing them if they have a page up for me. How do I tell if they do?

context

I would think twice about hiring someone if their previous place of work was a scam company, at least because it tells me something about that person's ethical compass.

context

It’s exactly the opposite. If you have skills that are rare (and exist at a level below demand), you have a lot of bargaining power. Unions are for people who didn’t have much bargaining power in absence of the union.

context

Under this logic every agency deserves to have a bigger budget until the budget is equal to the amount of money it brings in -- i.e. bloat up to maximum size. I don't think anyone wants that.

It makes sense to try to keep government agencies operating efficiently. Unfortunately I don't trust today's journalism to give me an accurate picture of whether what's going on falls into that category or not.


context

> I've no problem paying taxes, they buy me civilization.

They do, but this line of argumentation is kind of vacuous if you don't include the actual cost of taxes. If taxes could be 1/3 of what everyone is paying, thus buying the same civilization except much more prosperous since money is used a lot more productively and people have much more choice about where it goes, isn't that better? (This 1/3 isn't meant to be a real number; it's a thought experiment).

Alternative way of looking at it -- if this argument works without cost, then why shouldn't everyone just pay 100% taxes?


context

An IRS budget of $14B means each person in the USA is paying $43 per year to fund the agency (including children). At the Federal minimum wage, after taxes, that's about a full working day to pay it off. If you have a non-working spouse and one child, three working days.

So it seems like a lot to me.

Here's an idea, why don't we simplify our Byzantine tax filing processes so that the whole thing doesn't cost so much. I know, I know, all that sweet H&R Block tax lobbyist money is addictive, but it would be better for the country if congress would put down the pipe.


context

Union voice actors make hundreds of dollars per hour typically (a standard good but not-very-famous video game voice actor makes 2x scale).

context

It is pretty clear they are describing the Holographic Principle there. https://en.wikipedia.org/wiki/Holographic_principle

context

The original Bay Bridge was built in 3 years, with 1930s technology. We used to not be a bunch of incompetent idiots when it came to building infrastructure, but now we clearly are. I understand why people don’t want to pay large amounts of tax money to pay incompetent idiots to take a long time building tiny amounts of infrastructure.

context

‘judder’ refers to dropped frames in a fixed-frame-rate application, frames that are not dropped but are rendered too late, or, more generally, variance in frame rate of a program that should be smoothly animating.

context

It’s tax breaks. If you don’t employ enough people, you won’t be paying enough taxes to take advantage of tax breaks anyway.

context

The graph states that it is adjusted for inflation.

context

Of course it seems weird to think of Many Worlds as somehow manufacturing universes for each fluctuation. That seems weird and wrong ... because it probably is.

A better way to think of it is as the equivalent of the relativistic block universe. All these different spaces already exist in some superspace, and ‘random’ events take you from one space to a neighboring one. Nothing is manufactured.


context

They used them to research, develop and build every piece of technology you used to post this snarky comment ... ??

context

Many reporters have written many stories they knew to be false in the past, even without this incentive. So I am not sure what your claim is.

context

That's news to me... Also news to some of the biggest hits ever, like Minecraft.

context

Which is exactly why it's a terrible benchmark -- it doesn't tell you anything about what you care about. Which is one reason why it's a terrible article.

Another reason is that the author seems to have put a lot more engineering into the Rust program than the C program. Most of the word-count of the article is devoted to extra engineering on the Rust program! (The whole thing about sorting out lumps, etc). It's reasonable to infer that this also mirrors the situation prior to the first benchmark. If you put more effort into optimizing one program than another, you'd expect the higher-effort program to be faster, all else being equal.

It's embarrassing that HN thinks this article is worth posting. There might be something to write about here, somewhere, but it would have to be framed in a very different way in order to be honest. Also, it would be a much shorter article.


context

s/Newspaper//g

context

These instructions will be around for a long time, but their performance attributes will change in 5 minutes when Intel releases the next wave of processors.

I think given the current state of things it would be irresponsible for compilers to generate heavy instructions unless asked. Forget trying to be smart about it ... we already fail to be smart about things that are much simpler and more visible.

More interestingly, this may be what all CPU behavior looks like in 10 years, because if Intel has to resort to this kind of design now, why would that change any time soon? Instead of worrying primarily about keeping the execution units full, people trying to write fast code may be primarily concerned with keeping them NOT full so that the chip doesn't slow down. Which sounds crazy and hard to deal with.


context

I do. I am a Prime member but when Whole Foods cashiers ask I say "no" because I just want to get the bullshit over with and get out of there.

context

Not sure who is downvoting you. I charged my Tesla Roadster on 120v 15A for like 6 years.

context

Last I checked, most homes in the USA had 220v 30A dryer outlets.

context

Many physicists do not believe that collapse of the wavefunction is a thing.

context

> Household wiring will anyway not have enough current capacity to support charging

citation needed


context

This isn't really about minimal version selection. It's about how it is bad for your code to change behind your back ... which is a much more general and widely-applicable point.

context

The graphics APIs we use are often just about managing raw buffers of memory -- often different areas of memory that have different performance characteristics (Onion Bus vs Garlic Bus, yo). You simply can't do that under constraints of memory safety.

But beyond that ... "smart pointers" and the like make your program slow, because they postulate that your data is a lot of small things allocated far away from each other on the heap. I spoke about this in more depth at my Barcelona talk earlier this year.


context

This is true, but it mainly just mitigates the problem. It can never solve the problem, because the whole concept of this kind of scheme is that the memory manager has some volition of its own ... thus it can choose to do things when you don't want or expect (actually this is unavoidable).

Also note that this category of answer is basically saying, "look, if you mostly manage your own memory, then GC takes less time!" That's true, but a large part of the value proposition of GC in the first place was to remove the burden of memory management. Once you are saying actually, GC won't do that for this class of application, then really what you are getting out of GC is memory safety (provided the rest of the language is memory-safe). On the one hand, hey, memory-safety is a benefit. On the other hand, I don't think very many people in game development would trade that much performance just for memory safety.

(And in fact in game development we very often have to do unsafe memory things. So really what ends up being said is "much of the system has memory safety" which, really, does not sound very alluring.)


context

I get my number from decades of professional game programming.

In a 7ms frame, you are spending most of that frame doing the work of rendering the actual frame (unless your game is so trivial that the GC is going to be easy / fast anyway). An additional millisecond is going to cause you to miss your deadline and drop a frame. Dropped frames feel really bad.


context

8ms? Where do you get that number from?

If your player has a 144Hz monitor then your pause time, rounded to the nearest millisecond, has to be 0ms.


context

Except no, because rhetorically they are using the 62% figure as a good sign, rather than a bound on how bad things are.

context

Did they? I can't tell from the article!

context

> If an initial replication attempt failed, the researchers added even more participants.

This is an obvious way to tamper with the results; it's just more of the same kind of p-hacking that bad researchers are so often doing. They are using "we re-do the study with a larger population" as a way to re-roll the dice if the first die roll doesn't come up the way they want. (Note that if the die roll did come up the way they want, they don't re-do the study with a larger population in order to see if the replication fails).

Nobody should be taking this seriously.


context

There are some quotes missing from this title ... the proper spelling is Data “Scientists”.

context

Limited bandwidth implies limited data per month ... just multiply bandwidth by time.

context

Who said 120v? Plug it into a dryer outlet. My point was that you don't need all this big excessive expensive charging circuitry.

context

As someone who lived through the 80s, I will say the pop music of the time was not particularly good, but it was at least original. It was a new sound, and there were new aesthetics being explored.

That is not true today ... is it? Any examples?

By the mid-1990s we had a very clear idea of what 80s music was (electric guitar, synth and rap) and even what 90s music was (grunge and all this electronic stuff that I thought was super boring but a lot of people liked so whatever). It’s 2018 ... I have no idea what 2000s music was. Everyone can probably name a bunch of musicians that were big then, but what sound was being pioneered? Messing around with vocoders / harmonizers? That seems very limited and small.

What is the sound of 2010s music? I have simply no idea.


context

You probably don’t need a charging station. Teslas can charge from a wall outlet, for example.

context

Being a publicly-traded company sucks. I wouldn’t want to take my company public either, unless I had given up wanting to do interesting things with it, and just wanted out.

context

You have to work out a system, then, to figure out who pays to employ the people doing unprofitable work. Not as easy as it sounds.

context

The character of San Francisco is currently homeless people everywhere, and poop and needles all over the sidewalks.

context

I disagree. The slowness of these languages is mostly uncorrelated with increases in productivity. People only think there’s cause-and-effect here because they haven’t seen counterexamples, because the trend in language design for 25 years has been to make slow languages.

context

We all have these things in our pockets that are massive supercomputers by the standards of my college days. And we all use them just to rant at each other about political things that we in reality are totally misinformed about ... and otherwise just waste our lives. I am not sure why quantum computing is supposed to change this.

Even software that is supposed to be useful is so terribly slow. Computers are between 2 and 4 orders of magnitude faster than programmers today experientially believe they are, because today’s culture of programming has rotted so thoroughly. Do you really need a quantum computer when 3 orders of magnitude are just sitting there on the table waiting to be picked up?


context

Yes. This is all so obvious that I wonder why societally we have such a hard time seeing it ... i.e. what are we getting in return for the obvious willful ignorance we are engaging in.

context

The Supreme Court is just interpreting the law. If you don’t like their interpretation, amend the law so that it’s clear. If you aren’t able to get that done, maybe it’s because a lot of people disagree with you about what the law should be.

I am not much educated in this particular issue, but I have to say the language of this blog post led me to doubt that the author is a credible source of information.


context

You don’t have a debug allocator that just tells you what file and line made the allocation? Ouch.

context

Can you explain why the decision is poor? Because it results in fewer jobs (but still the same number of jobs Tesla had in early 2018)? Is number of jobs the only metric that determines whether a decision is good or bad? If so that would lead to an entire system of economics that is very different from the one economists know. So if you hold the secret to this vast trove of knowledge, you would be doing a disservice to humanity by not sharing your wisdom.

context

You are not going to count all the jobs he created in the first place, and the years of gainful employment those 3500 people had? It’s just all negative in your estimation?

context

As a customer, I hate interacting with unionized industries. Try, for example, having a booth at a trade show. It is completely awful, and most of that is because of unions.

> If you can only survive as a corporation that does the bare minimum to take care of employees you have no right to be an employer in the free market. Most corporations _can_ function with organized labor, but would rather not because it cuts into profits.

Think about this from a systems perspective. If you are a country that requires businesses to take a certain amount of overhead, then some percentage of businesses will just not be viable. Therefore your entire economy is x% smaller (at least). You can claim that this is offset by the economic well-being of the employees, but that's not at all obvious and would require a lot of justification. (If it were true, you'd expect countries like France to be more healthy economically than the USA, whereas in fact France is quite stagnant). If your economy is x% smaller, it means you are not competing effectively with other countries and have less leverage when dealing with them. And the shrunken economy has real consequences on the standard of living of people living in the country. etc, etc.

If you think of this only from a lens of "capitalists are evil people who deserve to be taxed to support the good workers" you are going to be missing most of the picture.


context

You will think it's less beautiful when you ship that game on several platforms and find that it has different bugs on each platform, on each hardware version, and on each driver version. And most of these bugs you can't fix or work around, you just have to bug the vendor and hope they ship a fix in a few months, which they usually won't because your game is too small for them to care about.

This happens in other APIs too (we definitely had it happen with DX11), it's just that OpenGL is a lot more complicated than anything else due to its history, so it has proportionally more bugs.


context

Maybe the games were not very complex? Professional game programmers building games with lots of shaders are very familiar with what I am talking about. See for example this thread:

https://www.opengl.org/discussion_boards/showthread.php/1998...


context

Nope, glVertex3f was deprecated years ago by OpenGL itself. That is not the way the API works any more. [1]

Look into what it takes to write the minimum viable OpenGL program, written using non-deprecated routines, that puts a textured triangle on the screen. It sucks. On top of that, OpenGL is slow and gives you no way to create programs with smooth performance -- for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.
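
For a sense of scale, here is just a compressed outline of the required calls (not a complete program; context creation, GLSL source, error checking, and image loading are all assumed to exist elsewhere):

    /* Outline only: assumes a GL context, a loader (glad/GLEW/etc.), and
       externally supplied shader source, vertex data and pixels. */
    extern const char *vs_src, *fs_src;  /* GLSL vertex/fragment source */
    extern float verts[12];              /* 3 vertices * (pos.xy + uv)  */
    extern int w, h;                     /* texture dimensions          */
    extern unsigned char *pixels;        /* RGBA texture data           */

    void draw_textured_triangle(void) {
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vs_src, NULL);
        glCompileShader(vs);                       /* + check the log... */
        GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(fs, 1, &fs_src, NULL);
        glCompileShader(fs);
        GLuint prog = glCreateProgram();
        glAttachShader(prog, vs);
        glAttachShader(prog, fs);
        glLinkProgram(prog);                       /* + check the log... */

        GLuint vao, vbo, tex;
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof verts, verts, GL_STATIC_DRAW);
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 16, (void *)0); /* pos */
        glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 16, (void *)8); /* uv  */
        glEnableVertexAttribArray(0);
        glEnableVertexAttribArray(1);

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA,
                     GL_UNSIGNED_BYTE, pixels);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

        glUseProgram(prog);
        glDrawArrays(GL_TRIANGLES, 0, 3);
    }

The immediate-mode version this replaced was a glBegin/glEnd block with three vertices.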

1990s-style OpenGL was good for the time. In 2018, OpenGL is a pile of poop.

[1] https://www.khronos.org/opengl/wiki/Legacy_OpenGL


context

Good riddance. OpenGL has been an awful API for many years now. The drivers are way too complicated, and applications don’t have enough control to deliver smooth performance. All OpenGL does now is let you kind of mediocrily put things on the screen.

context

I don’t get it either.

context

Government massively subsidizes the oil industry all the time.

If you have more EVs running, there is more incentive to upgrade the power generation structure, because it produces more environmental benefit.

Also, come on, the minority of generation comes from gas, but you are calling it "methane powered ICs by proxy". That looks like extremely motivated reasoning (to put it charitably).


context

Amen. I am so tired of trying to read these articles that spend so much time on random personal backstories because they don’t have faith that the core subject matter is interesting to read. The worst are the ones that jump back and forth between A/B/C plots, jumping away before any satisfying conclusion is reached on any of them, as a way of tricking your brain into reading the article ... like a really bad TV show, or Westworld.

All those articles can go burn in a hot, hot fire.


context

So is carbon offsetting really a thing, or not? If it is, well, these sites that tell you how much to give based on flight length say it is really cheap. So just make offsetting mandatory. The end.

context

California has good weather and spends a ton of money every year on homeless support services.

If you’re homeless, why wouldn’t you come to California?


context

Wait what? Amazon often packs multiple products into the same shipment. I assume each individual product is considered a purchase. And where do you get the $5-8 figure from? I can ship things for that much, so you’d have to assume Amazon gets a discount for massive volume.

context

But that’s not the real problem. The real problem is that the legitimate schools have been able to raise their prices tremendously because the students have all this easy money given to them for just that purpose.

context

When I watched the movie via a video player made from Redstone, I didn't realize it at first, but the inexact color reproduction really subtracted nuance from the mood, and the fact that I had to run away from monsters every night really prevented me from mentally investing in the movie. This was all really subtle, but later on when I watched the movie in a big theater, I was amazed at the difference!

context

The high student debt and unaffordable housing are almost entirely due to policies of the Democratic Party, a.k.a. the party that is supposedly trying to reduce income inequality.

Interventions often don’t do what you wanted them to do.


context

Because after octonions there isn't anything more.

context

The tone of the first paragraph should tell you everything you need to know about this article.

context

Why do you think that's a "typical baby boomer's idea of a vacation"? The people you were observing were 100% pre-selected to be the type of people who would go on a cruise. It's obviously not representative of the population at large.

context

You named some random things that have been around for short periods of time and claimed that "they won". I don't get it.

context

There are no sources listed anywhere in this article. The wording feels very intentionally vague. Not sure anyone should take it seriously.

context

That would kill HoloLens and DX13.

context

Baumol predicted this 50 years ago; it's called the "cost disease".

The inflation rate for a computer with a fixed set of specs is massively negative -- it gets cheaper every year. So it is for many technological devices.

But our whole economy is a mixture of technological stuff (that drops in cost over time, i.e. negative inflation) and non-technological stuff (burritos and health care).

The overall inflation rate is an average across the entire economy. Even if you believe the reporting is not distorted (which is dubious), then the fact that there are so many goods whose prices drop quickly over time, implies that there have to be many goods and services whose prices go up much faster than "inflation" would predict. Because something has to balance that average!

Baumol calls the technological stuff the "progressive sector" and the non-technological stuff the "stagnant sector". As time goes on, prices in the stagnant sector continue to rise until they consume almost all spending.

Baumol made specific predictions based on this model in 1960 that have turned out to be consistently true for 50 years ("the cost of healthcare will continue to rise to degrees that will seem scary" and so forth).

Furthermore, it's not like it is some weird complicated or hard-to-substantiate theory. It is just math, not much more complicated than the definition of the average. Given how big the consequences are, and how hard to argue with, it surprises me that this idea occupies so little of the public conversation.
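
To make the averaging concrete (numbers invented purely for illustration): suppose half of spending is in the progressive sector and its prices fall 10% a year, while measured overall inflation is 2%. Then the stagnant half must satisfy

    0.5 * (-10%) + 0.5 * s = 2%   =>   s = +14%

so the stagnant sector is inflating at 14% a year even while the headline number reads 2%.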


context

How micro can it possibly be if it takes 10 people to deal with it? Seriously.

context

Those of us who have been around since the 80s-90s are astonished by the low productivity of today's programmers.

If it takes 1 person to run 1 microservice, we are all doomed.


context

> And yet, these games are being released today, are being funded by major and minor publishers and developers alike for tomorrow

And most developers making these kinds of games today fail to make their money back and go out of business.

I was in charge of one of the games you listed, so I know something about this topic.


context

Not everything is a conspiracy. I know the author of this article and he is a straightforward person.

The game industry is a big place with many different kinds of people in it, all of whom have different motivations.

If you build a picture for yourself wherein everyone is That Guy At EA Who Put Loot Boxes Into Battlefront 2, then that picture will probably get in the way of your understanding of what is being said by everyone else in the industry who is not that one guy.

This article is targeted primarily at developers, not at gamers. It is an "oh crap what do we do" article, because believe me, that is a big problem and in general we do not know what to do.

This was not some "cost of games need to go up because blah blah" justification article ... it is a straightforward look at what has been happening and some musings about what we can do about it, with "cost of games will probably rise" thrown in as a 6-word aside near the end (at which time he also points out that nobody wants to do this, which is true).

You're reading the article you want to read, not the article that the author wrote.


context

The cars are dirty and smelly. The upholstery you sit on is gross. Once in a while you will find human feces in the cars. Most BART stations do not even have restrooms, water fountains etc, so I hope you don't have any human needs.

The trains do not run often during the day. If you miss one, have fun waiting 20-40 minutes for the next one. They stop at midnight.

It takes a long time to get to your destination, and the destinations available are not a very large subset of where you actually want to go. Most of the time you will have to chain together another form of public transport on one or both ends, and the transfer time between different forms of transport is often long and pads out the length of a commute tremendously.

The stations and trains are unsafe, and people get attacked on them with some regularity.

On top of all these things, BART is very expensive!

I don't know how you could think BART was good unless you had never used any other comparable transport system.


context

I don’t think this is fully explanatory. I have lived in the Bay Area for almost 30 years, and, to borrow a term from our illustrious President, San Francisco is more of a shithole now than it has ever been. This has little if anything to do with the ages of buildings (except of course that if we built a lot more buildings anywhere nearby, things would be a lot more affordable. Why South SF is not on this like white on rice, I have no idea.)

context

BART is one of the worst public transit projects in the western world... if you ask "would we have been better off with BART or something else?" the answer is probably something else.

context

If it is not SpaceX’s fault in this case, what exactly do you think it is reasonable for them to do?

It’s weird that even your reply seems to assume it’s SpaceX’s fault. Why? (“I don’t like to see SpaceX fail...”)


context

Firing someone is speech? A whole lot of laws disagree with you about that.

context

Uhh, he had The Entire Internet doing that at him 24/7, why do you think there needs to be more?

context

If people "on the left" won't allow him to speak, where exactly do you expect him to go?

It's like the people on the right saying Edward Snowden is obviously a traitor/spy because he went to Russia. Uhhh... something very obvious is being ignored there.

Also, come on ... Joe Rogan is definitely not right-wing. Jordan Peterson and Dave Rubin do not self-identify as right-wing. And your [1] just seems a little extreme. "The far right is just a slip away from the reasonable right, therefore the reasonable right is dangerous?" But you don't apply that same idea to the left? Why not?

The real pattern here is that he went to the shows that would have him as a guest for reasonable discussion.


context

1. Sure, Tesla's whole plan, since the foundation of the company, was to start out making expensive cars and then to continually make them cheaper until they are completely mass-market. They have consistently followed this plan. So I don't see why you think they'd suddenly stop?

2. See 1.

3. Actually charging from 110v would take more than overnight -- several days at least if you are empty! It's very slow (a lot slower than half the rate of 220v, because a certain amount of power goes to overhead like cooling the batteries during charging). Even so, I got by on 110v for years. As for having one dryer socket ... buy a splitter cord for 5 bucks? I am not sure why you think you need a contractor and permits, unless you are just trying to fabricate reasons why electric cars are a problem.

4. See 1.

5. The market today is tremendously more developed than it was in 2008 when Tesla started selling their first car, and everyone thought electric cars were just golf carts that were completely infeasible. Why do you think this trend would stop now? On the luxury point ... see 1.


context

Electric cars are baller. They are fun to drive.

My Tesla Roadster does not corner like a Porsche 911, it's true, but that does not matter for everyday driving, because it is so effortlessly faster than anything else on the road: when I pull away from a stoplight, whoever I want to cut in front of is 50 feet behind me.

Instant acceleration is worth a lot, too.

For comparison: My previous car was a BMW M3. I would much, much rather drive my 2010 Tesla Roadster, and that is an old car at this point ... 2020 Tesla Roadster is going to be amazing.

> The range and the charging times are still an issue, especially if you don't live in California.

My roadster goes 340 miles on one charge. The upcoming 2020 Roadster goes 620 miles on one charge.

Admittedly, battery capacity costs money. Price-sensitivity is the reason most of these newly-announced cars don't have as much range. It is not a technical limitation.

And as with any such technology component, battery costs will continue to drop over time.


context

1. My EV (Tesla Roadster, updated battery) goes 340 miles on one charge. Admittedly it is a small car. The new Tesla Roadster slated for 2020 goes 620 miles on one charge. Admittedly that is an expensive car. But you should be able to get good range in a less-expensive car at that point.

2. Newer Teslas (Model S, Model X, Model 3) have charger planning built into their standard map program. About other EVs, if I were an EV manufacturer I would do a license deal with Tesla to allow my cars to use their charging network. But maybe they'll build their own. The one Tesla built is pretty good ... if a small company can do that, GM can do way more, and do it faster. So even though "there's no infrastructure" has been raised as this huge problem for years, it does not in actuality look like much of a problem.

3. Good electric cars have their own charging circuitry built in, so you just plug it into the wall. You can even charge off 110 volts, though it takes a long time. A dryer socket is much more reasonable. Just being able to plug in your car, in your garage, is much more convenient than going to a gas station. (This presumes you have a garage to park in. If you park on the street, different story.)

4. Electric motors and drivetrains are much simpler and much more robust than mechanical engines and drivetrains. Electric cars just do not tend to need repair in the same way. (I have had my car for over 7 years and it has never yet needed a repair of this kind).

5. Tesla Roadsters still hold their value very well, considering. But this may be in part due to the fact that it's a rare car. I am not sure about the Leaf, etc.

6. Maybe, yeah.


context

It’s unfortunate you had to take this ad hominem.

context

Have you seen A Dark Room? It's text.

He ported a text game, that someone else had already designed and implemented, to iOS.

This is not a huge amount of work compared to what most game development people do, and is not particularly challenging compared to what people are doing with 3D graphics, etc.

I am pointing this out not to be mean, but to respond to your point: "those who work as hard as he does and are as smart as him". What he did for A Dark Room was something pretty much any iOS programmer can do, and in those conditions it's natural not to expect to be noticed in a crowded market. i.e. he evidently did not do anything noticeably smarter or work noticeably harder than anyone else.

This doesn't mean he can't do smarter and bigger things, just that a port of ADR is not that, and neither were his other descriptions of projects he did in the meantime (for example, look at the screenshots for Mildly Interesting RTS).

I think when this is the limit of what one is attempting, one has no right to complain that people aren't buying one's stuff (and should not be surprised at that either!)


context

Yes, the game market is often hard. But I find this author's attitude fatalistic and weird. (Also I agree that the numbers do not add up).

He seems to think that "rolling the dice" is the primary thing going on, and that he can't influence this substantially by any amount of development skill, proper tactics, and so forth.

I wonder if this kind of mindset is a side-effect of modern-day political correctness -- that if you succeed, it is 100% due to your good fortune (combined privilege and luck) and not because you did anything special to get there. The upside of such an attitude is that we recognize luck and recognize that people who failed did not necessarily fail through fault of their own. One major downside is that we feel a lack of agency, and this lack of agency is almost certain to drastically decrease the chances of future success.

Be careful what you believe, because it matters.

For my part, I think talent and skill and grit are huge factors in success, so if I want to succeed at something, I am able to formulate some kind of a plan. I don't feel that I am "just rolling the dice", even when luck is involved (which it is all the time, in everything).


context

Tokyo subway is privatized and is very very good.

context

It’s positive coverage, but that doesn’t mean it’s paid. When game journalists like things, they write positively about them.

context

As someone who lived in SF for 24 years and is probably moving back, I will say that SF is a total dump and needs help badly.

context

The comment was a criticism of the US as compared to France. So you can’t write off the US as an anomaly because that comparison was the whole point!

context

Okay, but France is not known for its stellar economy or the prosperity of its people. Over here, at least, France is known for its stifling amount of government control when it comes to anything economic. It’s possible these things are related. (Just sayin’!)

context

I don’t think the Bartle stuff is useful in terms of concrete game design, the main weapon in my online game (from the 1990s!) was hitscan and not slow-moving projectiles as you assert (so I don’t think you know what you are talking about there), and the whole tone of your reply is personally hostile while asserting really weird things, so I don’t see anything else meaningful worth replying to, sorry.

context

I am talking about people’s home internet connections (or the PC bang where they play, etc). High-budget games these days locate servers around the world, so that is not an issue.

Yes, good connections are expensive for some people. If you want to do something about that, maybe you include amount of latency in a player’s ELO or something. I am just saying that I do not like Valve’s approach to the problem (which has been inherited by many other games) because I care about games and it makes games worse overall.


context

How popular is your online game?

P.S. Someone the other day asked why I don’t post much to HN (and don’t put much effort in when I do). This kind of junk is exactly why!


context

I don’t recommend lag compensation. It’s a bad idea because it degrades the experiences of more-committed / better players in order to cater to the less-committed / worse players.

context

Thanks for the nice comments.

But I only think this because I have had the experience many times in the past!

I feel the best place to focus my creative energy is on making livestreams, doing speeches, and just on day-to-day programming.

Putting effort into a posting here often doesn't take too much energy, but it does take some, and I'd rather put it into something bigger than mostly toss it away.


context

Exactly -- some kinds of games are soft-launch-and-iterate kinds of games, but some other kinds are very much not, and if you soft-launch them you make it tremendously harder to build an audience.

context

Not really worth it -- if I put in effort to write a substantial comment here, it's just buried in the noise of all these people who think they can give advice on how to build and launch a game.

context

I recommend being careful from whom you take advice.

context

There is something much deeper happening.

As you become successful in your field (or wherever), and further internalize the habits that are necessary to be successful, it becomes clear that many of these things are easy to do; it's just that people don't want to do them.

In other words ... it's obvious that many people don't want to be successful, and if they were to introspect deeply, they would see this clearly. In fact what they want is to be somewhere comfortable in the middle of the herd, not having to do too much work.

Most people want to be comfortable, not 'successful' in a way that requires ambition. But many people are brainwashed enough by the rhetoric of success that they don't realize it's not what they want.

There's also something I haven't figured out yet. Every time I give advice, I get a number of responses from people with self-defeating attitudes, explaining how this advice can't possibly apply to them because blah blah blah. These people build up belief structures that are obviously intended to keep them mired in their current situation, smelling of low self-esteem and defeatism. "Obviously" it's better not to be stuck in these belief structures, yet people will defend them vigorously, and in some cases fiercely. I don't yet fully understand why, except maybe that if someone believes there is a solution to their problem, then it must be their fault that they haven't solved it, and/or that there will be a clear failure that is their fault if they attempt to solve it.


context

I’ve noticed the word “gaslighting” being increasingly used. I’ve also noticed that my brain responds to use of this word by decreasing its expectation that the text or utterance is worth listening to.

Looking at the way it’s used in this comment, I would say it is ascribing malice where malice is unlikely, and would be implausible in the first place. So maybe my brain’s heuristic is justified.


context

You don’t need to do a generic AST. Have it be a protocol ... that runs in-process via an API. Because you know what, that’s what an API is ... a protocol for communicating with someone else’s code. But you don’t have to run extra processes or suffer context switches, and you don’t have to be in the business of debugging distributed systems in order to accomplish any tiny thing. Amazing!!!

This whole LSP thing is a mindbogglingly bad idea, brought to you by the same kinds of thought processes that created the disaster that is today’s WWW.
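
To illustrate the alternative (a sketch only; every name here is hypothetical): the "protocol" can literally be a C header that a language-service library exports and the editor calls in-process:

    /* Hypothetical in-process language service: the "protocol" is just
       this struct layout, exported from a dynamic library. No process,
       no socket, no JSON -- one function call per request. */
    typedef struct {
        int          version;
        void        *ctx;
        void       (*open_file)(void *ctx, const char *path, const char *text);
        int        (*completions_at)(void *ctx, const char *path, int offset,
                                     const char **out, int max_out);
        const char *(*hover_info)(void *ctx, const char *path, int offset);
    } LanguageService;

    /* The one exported entry point. */
    LanguageService *get_language_service(void);

Every operation is a direct call; there is nothing extra to spawn, serialize, or debug.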


context

crc32

context

It's not that "the mapping" makes "the simulation" make sense. You have moved most of "the simulation" into "the mapping". So if you don't do the mapping, the simulation mostly does not happen, so there is nothing to make sense of.

context

The hole would be smaller, thus more likely to be affordable with wages / savings / etc.

context

A mapping is a computation.

If your mapping is very complex compared to the ostensible simulator, then the mapping is actually doing most of the simulation. So the simulation is not mostly running when the ostensible simulator runs, it is running when you perform the mapping.

If you are inside the universe being simulated and think you can do that mapping, it seems unlikely that there exist enough time and space for it.


context

I would pay $100/hr for a competent freelance system administrator who could parachute in and deal with the various things we need dealt with.

Of course the problem here is "competent". On two different occasions people have offered to do the job and they were terrible.


context

Exactly.

Most proponents of unit tests use horrible programming languages; they are afraid to change code because anything could break at any time. Stop using those languages and most of the problems 'fixed' by unit testing just disappear.


context

Condescending and unnecessary. How do you know he's a poor contract reader? Compared to whom, you? Do you know exactly what position he was in at that time? Have you ever been in such a position? Do you make art independently for a living, or even try?

context

More to the point, if this became the commonly-accepted sane way to do things, php would support it directly and it would be easy.

context

Hahahaha

So they wanted to claim to offer a high-quality product, but don't think they can actually do that.

Startup culture is so terrible.


context

It is my pet peeve that people believe this.

If this were actually true, why wouldn’t we have evolved an extreme aversion to touching our faces?

Also, have you ever seen a baby? What does a baby do if you leave it alone to crawl around?


context

The problem is that your car-driving case presumes the driver already wants to go to MV and is bringing his friends and they don’t mind coordinating to make that happen.

If you change your thought-experiment to a paid service, good luck being cheaper than CalTrain.


context

Why do people keep thinking they can use a garbage-collected language for “systems programming”?

context

It was a clickbait headline -- note that it has been revised.

context

> but what about the other countless number of people your age with similar goals? How do you stack up to them? That should be your real metric rather than the 1 in a million success stories.

I disagree with that. Most people are not successful, so if your target is the average, you are aiming at a data point that represents lack of success. It is important to understand that most successful people are not normal, and the higher the level of success, the less normal they are.

> "The best time to plant a tree was 20 years ago. The second best time is now."

Totally.


context

The article as a whole is nonsensical.

It is a little bit correct when it says that regret is not a useful reaction to a past you're unhappy with, but even that by itself is misleading. Regret is a useful emotion that helps you shape future actions. What is not useful is paralyzing regret, or any flavor of regret that keeps you wallowing in the past.

When he says "you are not behind", that is mostly wrong. If you're 25 and aren't yet doing anything individual and attemptedly groundbreaking with your life, you probably are behind, if that kind of thing is your goal. Sticking your head in the sand is not going to make this better. Being complacent and saying it's fine, I am only 25, no wait 26, no wait 27 until you are 40 isn't going to help either.

There is a reason the human mind is able to conjure phantasmal pictures of "where we should be" -- because that is useful. If you choose to ignore that in order to have a shallow feel-good time in the short term, you do so to your own detriment.

All that said, if you are genuinely content with where you are today, then everything is fine and you don't need externally-imposed images to tell you where you "should" be. This advice is only for people who deep-down want to build interesting new things.

My last comment is ... this seems like an excerpt from a self-help book written by someone who perhaps should gain further life experience before writing a self-help book. When you decide to write a self-help book you take upon yourself a substantial ethical burden, because if you give the wrong advice, you can affect many people's lives in a negative way. So you should make sure you really know what you are talking about.


context

But she's also not taking computer classes in her spare time, and work is approximately the same number of hours, leaving approximately the same amount of spare time ... even if she were taking computer classes it would not be a reasonable comparison, because taking computer classes in the 1980s was a much more rare and advanced thing than taking them today. It'd have to be something more like quantum mechanics classes, or nanobiology classes, or, I don't even know.

context

It occurs to me, maybe this is another instance of the left engaging in silencing tactics and maybe that is a weakness in the flagging algorithm.

I can't vote "don't flag this". So if there are approximately two sides to a discussion, and one side wants to flag it to silence the discussion, then the discussion is going to get flagged no matter what.

So the side that wants to silence just selectively silences the opinions they don't agree with, and they win.


context

I don't think it's false and nasty. I was thinking the same thing ... quite objectively. There were at least two discussions that I thought were very reasonable, and they got flagged into oblivion very quickly. So I don't know why the YC link gets to be the exception. It feels wrong.

Maybe it is due to the users, but if that is so, it feels wrong enough to give me a pretty big loss of faith in the dynamics of the community.


context

This topic, a link to an NYT editorial, has been flagged. This seems highly problematic to me. Discussion here was pretty reasonable last I checked.

context

I am senior at my job and I disagree with that assertion. People skills are indeed helpful and a good benefit, but they are nowhere near primary. Consider the following two people:

(A) Understands people hardly at all, is constantly confused by them, but understands computers very well

(B) Understands computers hardly at all, is constantly confused by them, but understands people very well

Which of these two candidates is going to be able to design and build a complex software product?


context

No, I didn't say anything of the kind.

I am saying that this kind of mob shaming-and-silencing mentality is deplorable. And if you choose to engage in that, then you will alienate a lot of people, including many of the best people.

I am not saying anything about government, and as I said in my previous posting, I find it weird that people keep jumping to this. We're talking about ethics, not law.


context

I find pretty unconscionable this line of rhetoric, which I see a lot lately from people on the left: "this isn't a free speech issue, because it's not the government doing the censoring".

I want to live in a society that embraces liberal values like freedom of expression. Preventing the government from encroaching on those values is a good idea. But if we then go and clamp down on those freedoms everywhere else, then it won't matter that the government doesn't do it -- nobody will be able to express themselves freely anyway.

This seems to be the society that the 'progressives' want and it disturbs me enough to have completely alienated me from that movement, and I am far from the only one, so I don't know why they aren't stopping and questioning the efficacy of this philosophy right about now.

If we are really a society that embraces liberal values, then we want those values to be upheld throughout the society, not just in the part explicitly controlled by laws.


context

Did you read the memo? It doesn't seem like it.

context

By this metric just about every single new business is a failure, including 100% of YC startups. At that point is it a useful distinction to draw?

context

For this to be the case, they would have to be applying the principle to everyone equally, which they apparently don't.

This has given pause to some people -- for example, Sam Harris, one of the top artists on Patreon, is evacuating the service (even though this is likely to cost him a fair bit of money) to avoid the future potential to be financially pressured over ideology.


context

> Am I being irrational here?

Not so much irrational as not particular enough; this line of reasoning doesn't really work.

The thing is ... at any one instant, the amount of memory you are able to recall / visualize / etc is very small, and if your mind is occupied by that thought, you won't have any thoughts about the present simultaneously. It is only the apparent continuity of time that seems to link these things.

So when you say "thinking about all the other things in my memory", well, you can't experience all the things that are supposedly in your memory. You can only experience one small part of that at any time. If there is only one time, then the other stuff does not exist, you just have confidence that it exists for some reason.


context

Yes, if it is creative success and not merely monetary success.

I am of a personality type that I don't think I could be happy without creative success (loosely defined as having done a good job on creating things that would not exist if I hadn't made them). In a previous phase of life, I was not successful at making things, and I was pretty unhappy. Now I am successful at making things, and am much happier (though I have also developed several mind-management skills).

If you are talking about "1m+" as the sole gauge of success, I don't think that means very much.


context

Yeah, the fact that you are getting downvoted is proof that the average HN voter does not understand software.

context

Just about any old language, including C post-ANSI. The main relatively-heavily-used offenders are Lisp and shell scripts of various kinds. (And I think "oops, you made a typo, you don't find out until you try to run it and happen to hit that branch of code, oops that did not happen for three months, who knows what other bombs are sitting there waiting" is one of the primary factors in Lisp not gaining wider adoption.)

I mean, C is less compile-time-checky than Pascal, but it is tremendously better than JavaScript in that regard. So is FORTRAN, etc, etc.
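
A trivial C illustration of the difference:

    void count(void) {
        int total = 0;
        totl += 1;   /* error: 'totl' undeclared -- caught at compile time,
                        not three months later when this branch finally runs */
    }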


context

Valve has enough money in the bank to do a console launch and spend as much as Microsoft or Sony do.

It's just, their corporate culture makes them very conservative about spending money. SpaceX they ain't.


context

> Programming with new languages comes with more compile time checking and linting, less chances to make simple stupid typo-like errors that take hours to debug.

Most of the "old languages" did not have this problem. This was a product of the 90s when people built a bunch of stuff really fast without stopping to think about whether what they were building was really a good idea.

So, it's a couple of decades later, but that part of the 90s spirit is still in full force today, but there are many many more programmers. The logical conclusion is that the garbage dump being built today is way worse than the garbage dump built in the past, and that should scare you if you think typo debugging was bad.


context

It's not even usually "shitty behavior", it's just social interactions that are imperfect for one reason or another.

Concrete example: I see a lot of complaints from women about being talked over in conversations at conferences. I used to get talked over a lot too. It seems frequent that there are one or two people in a conversation who will talk over anyone with a less pushy conversation style. I think most of these people are not doing it on purpose, they just really want to say things and don't properly gauge the social balance of the conversation. Also there are other weird human tendencies happening -- for example, someone habitually wanting to display how smart they are -- which, while being mildly negative, are not negative in a way that involves hostility to others.


context

Many times I have read internet complaints about how sexist and horrible some conference is, and thought, "wait a minute, I have had this exact experience several times -- and I am a white male."

context

I moved out of California because of taxes. Top tax bracket is now 13.3%.

context

Consoles routinely launch new machines with not-previously-existing operating systems and do fine. So "there aren't enough pre-existing Linux games" is not explanatory.

context

Several people have said this. Look ... a "crash" in a modern operating system is a recoverable exception.

context

I don't understand what you are saying here.

Why is it "significantly more brittle"? It is a well-specified interface. It is less brittle than talking over a socket because the kinds of points of failure involved with sockets don't exist in this case.

> And it can't be spec'd with a schema that isn't just "read the headers."

What does that even mean? It's a protocol just like any protocol, except you get the added benefit that for many languages it can be typechecked. Why are you claiming it can't be specified or that someone has to "read the headers"? What headers?


context

> An API that can be accessed from heterogeneous languages will involve IPC.

No. If your language cannot call into a dynamic library using a well-defined C ABI for your platform, then it is already failing to speak a standard protocol. Building all kinds of crazy, complicated, slow infrastructure in order to get it to successfully speak some other protocol, is a symptom of modern-day clueless programming.

> Particularly since the best API will use the compiler's symbol tables (avoiding implementing syntactic and semantic analysis twice, buggily)

Yes, this is of course a good idea. Why one presumes this requires a separate running process, I have no idea.
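
As a sketch of how little machinery "speaking the C ABI" actually requires on a POSIX system (the library and function names are hypothetical; Windows would use LoadLibrary/GetProcAddress):

    #include <dlfcn.h>
    #include <stdio.h>

    int main(void) {
        /* "liblang.so" and lang_complete are hypothetical names. */
        void *lib = dlopen("./liblang.so", RTLD_NOW);
        if (!lib) { fprintf(stderr, "%s\n", dlerror()); return 1; }

        int (*lang_complete)(const char *path, int offset) =
            (int (*)(const char *, int))dlsym(lib, "lang_complete");
        if (!lang_complete) { fprintf(stderr, "%s\n", dlerror()); return 1; }

        /* A direct call: no extra process, no serialization, no socket. */
        printf("%d completions\n", lang_complete("main.c", 1234));
        dlclose(lib);
        return 0;
    }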


context

Just because the current way it's done is terrible, doesn't mean you should "upgrade" to a bad way instead of a good way.

context

This times 1000. AN API IS ALREADY A PROTOCOL but it doesn't require external processes, serialization, extra failure modes, etc, etc.

The fact that the HN community seems to have jumped aboard this idea, "yeah let's just require a server to do something simple like format text in your editor", is completely flabbergasting. People just seem to have NO IDEA how much complexity they are adding, and don't care.

Maybe in 5 years our machines will be running 10,000 processes at boot because people will want a server for every operation...


context

Chevy Bolt has sold 8,000 units total.

Tesla expects to produce and sell 80,000 units of the Model 3 in 2017.

We'll see if they hit that target, but come on ... in what reality can this be described as having "completely missed the ball"?


context

I disagree. That ratholing and reading is not resting at all, it is procrastination. If you want to rest, go to lunch or take a coffee break, or meditate or something.

> You can only concentrate so many hours a day. Without that resting your productivity drops.

This is the kind of thing people tell themselves to justify procrastination. If you are unable to concentrate for long, maybe you have damaged your attention span by too much internet browsing, and the cure is just to stop?


context

I posted my general thoughts on this topic yesterday:

https://news.ycombinator.com/item?id=14538761


context

I don't understand why HN readers are so eager to debunk the idea of 10x developer when it's obvious they exist.

For example, in this post you seem to be setting the upper bound around 3x. But actually, it is trivial to be 3x more productive than average: (a) Don't browse the internet while at work; (b) Sit there and spend your time working on the actual problem, not ratholing on programmer fixations that have nothing to do with the end result. Done. Congratulations, you are now 3x, before any consideration is made of experience level or talent or smartness or unique instinct or whatever else.


context

The whole reason I am bringing this up is because I have done 70+ hours of focused work on many occasions.

It's a more-than-linear improvement over 40 hours, because when pro-rated you have a lower density of context switches, getting-back-up-to-speed, etc, as happen in the morning or when you eat or such.

Maybe too many people have done long-term damage to their attention spans via the internet? I dunno.

At the risk of being mildly provocative ... are any of the people you know, who can't do more than 40 hours of work in a week, world-class in their field? If not, maybe there is a causal link between these two things?


context

> I have seen quite a few experts in their field and they didn't work long hours but everything they did counted.

Yeah, that is a much better position to be in than someone who is drowning in useless make-work all day.

But, after having reached this level of good set-up, one is now sort-of in competition with everyone else who has reached a similar good set-up. Well, even just ignoring the competition part, which is maybe a red herring, obviously you are self-gating how good you want to be by working more or fewer hours. So maybe these people decided a certain amount is "good enough" and didn't want to push past that, which is totally fine. But I am just raising the point that you can always push further if you want to.


context

This is all fine, but there are side-effects.

If you only work a minimum number of hours within your field, you are unlikely to emerge as one of the peak achievers or thought leaders in your field. That's just because you learn more from experience, and working more hours gives you more experience.

You can extrapolate from there what this means for companies and individuals.

I am not at all saying that companies should ask people to work long hours. (I run a software company, and we are super-lax about hours, people showing up at the office, etc). But I am saying that if an individual wants to be an expert in a particular field, that person should probably work a lot (and probably wants to work a lot anyway, due to interest in the subject). This doesn't necessarily have to be at the company; it could be at home, on personal projects, whatever. But the deeper and more challenging the project is, the better you learn, and it's easier to have one project that is deep and challenging than somehow to have two in parallel. And if only one is deep and challenging, then you are sort of idling with half your time. So there are basically two paths to this kind of deep work: work for a company, make sure you get a project that's really good, and then work hard on it; or go do your own thing, make sure you have enough money somehow, and work hard on what interests you.

This also means that "work-life balance" is not a thing for experts the way it is for normal people. But that's fine, because for these kinds of experts their work is a serious part of their life and the two things are inseparable.

Of course if you don't feel this way about what you're working on, that it is a serious part of your life, then this strategy doesn't make sense; and I would not encourage people who don't feel this way (who are the majority of the population) to work that hard. I am just pointing out that there are some of us for whom a different life strategy is best.


context

Indeed, it's a trivial lookup table problem; it's hard to come up with an easier one. So I would guess the confusion comes from lack of familiarity with bitwise operations, in which case why is the author presuming to write an article judging a certain implementation of bitwise operations impressive?
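
For reference, this is the whole method, sketched: an 8-bit table indexed by each byte of the word (the table can also be built at compile time):

    static unsigned char bits_in[256];

    static void init_table(void) {  /* call once at startup */
        for (int i = 1; i < 256; i++)
            bits_in[i] = (unsigned char)((i & 1) + bits_in[i >> 1]);
    }

    static int popcount_table(unsigned int x) {
        return bits_in[x & 0xff] + bits_in[(x >> 8) & 0xff]
             + bits_in[(x >> 16) & 0xff] + bits_in[x >> 24];
    }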

The intent of my comment is not to be mean, it's just ... there is a lot of noise on places like Hacker News and this article is part of that noise because, look, just about all compilers have had intrinsics for decades now at least, popcount is a very common one, so it's not surprising to see it turn up. It's not impressive as the title suggests, it's extremely common. And it's nothing specific about Rust because most production-quality languages do it. So both major elements of the article title are pretty much incorrect.

And it's fine not to know that when you're a beginner, I am not knocking that at all. But there's something about writing articles that then get broadcast, that give the wrong impression to other new people who are trying to learn. It's useful information that there is a popcount intrinsic in the Rust compiler, but this would be much more educational coming from someone who understands the context of all this stuff and can explain the real situation. Which may be the author of this article someday, maybe even someday very soon -- I don't wish to be inappropriately negative -- but it's not today.

I never liked going to school, and I think higher education is going to go through an existential crisis pretty soon, if it's not happening already. But one good thing about the old system is that at least there was this idea that you should work hard, and really learn the material, before you go presuming to teach people. And I think that's a very good idea. If you're inexperienced and there's a shortage of teachers and teaching needs to happen, then go for it -- but otherwise I think it is very important to keep in mind what one does and does not understand, and who understands it better, and to not presume to teach until one is in a good position to do so.

I know this goes a little bit against the current philosophy of "programming is great! Anyone can do it! Rah rah," but actually I think on closer inspection it doesn't. There's nothing wrong with participation, and community, and everyone contributing, etc. But it's important to keep an understanding of the difference between beginner contributions and advanced contributions, otherwise it seems possible to suffer a severe degradation of skill in the field over time, because how do people know what to shoot for if people of all expertise levels are teaching them and they can't tell the difference because they themselves are beginners?


context

This would not apply to the Rust method shown in the OP, since that one is operating on only 32 bits at a time.

It's unclear how well the benchmarks in this linked article generalize to other applications. If you are just popcounting in a tight loop, probably pretty well, but who does that? In reality you have other things going on, so if this method is occupying too many execution units or polluting your cache, you would see the effect of that on the rest of the program. But it's program-dependent, thus unclear.


context

But if you are reading a stream of bits, you don't want to read one bit at a time either, because that is pathologically slow. You want to read n bits at a time (where n varies each call, probably), at which point you're doing a very standard mask-and-shift with no magic...
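
A sketch of that standard approach (names illustrative): keep an accumulator, refill it a byte at a time, then shift and mask out the n requested bits:

    #include <stddef.h>
    #include <stdint.h>

    typedef struct {
        const uint8_t *data;
        size_t         len, byte;  /* input buffer and next byte to load */
        uint32_t       acc;        /* bit accumulator                    */
        int            nbits;      /* valid bits currently in acc        */
    } BitReader;

    /* Read n bits (1 <= n <= 24), MSB-first.  Refill, shift, mask. */
    static uint32_t read_bits(BitReader *r, int n) {
        while (r->nbits < n && r->byte < r->len) {
            r->acc = (r->acc << 8) | r->data[r->byte++];
            r->nbits += 8;
        }
        r->nbits -= n;
        return (r->acc >> r->nbits) & ((1u << n) - 1u);
    }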

context

> I was once asked to come up with a table lookup method for popcount on the spot and could not come up with a solution.

Oh, Hacker News.

If someone can't solve a problem like this off the top of their head, does it not act as a strong signal that they are a beginner and you should probably look elsewhere for quality information?


context

This is why you have a fallback to a generic version.
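
In GCC/Clang terms that looks something like this (the builtins are real; the structure is a sketch):

    /* Generic fallback (Kernighan's method: clear lowest set bit per loop). */
    static int popcount_generic(unsigned int x) {
        int n = 0;
        while (x) { x &= x - 1; n++; }
        return n;
    }

    #if defined(__GNUC__) && defined(__x86_64__)
    /* Compiled with popcnt enabled for this one function only. */
    __attribute__((target("popcnt")))
    static int popcount_hw(unsigned int x) {
        return __builtin_popcount(x);   /* emits the popcnt instruction here */
    }

    int popcount(unsigned int x) {
        /* Runtime dispatch: use the instruction when the CPU has it. */
        return __builtin_cpu_supports("popcnt") ? popcount_hw(x)
                                                : popcount_generic(x);
    }
    #else
    int popcount(unsigned int x) { return popcount_generic(x); }
    #endif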

context

Okay, wait. Every modern CPU has a popcount instruction, so any hand-coded implementation would use that, meaning the compiler output is actually pretty bad in an absolute sense.

But if you find popcount too "magical", the commonly-known fast way to count bits is via masking, shifts and adds, so that you do it in log(n) steps. Which also would perform much better than this solution.

So what you're really saying is "the compiler managed to make a pretty efficient representation of the naive solution" which is fine but it does not mean your code is fast.
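
The mask/shift/add method referred to, sketched for 32 bits (a standard formulation):

    /* Counts bits in log2(32) = 5 steps of masking, shifting, adding. */
    static unsigned int popcount32(unsigned int x) {
        x = (x & 0x55555555u) + ((x >> 1)  & 0x55555555u); /* pairs     */
        x = (x & 0x33333333u) + ((x >> 2)  & 0x33333333u); /* nibbles   */
        x = (x & 0x0f0f0f0fu) + ((x >> 4)  & 0x0f0f0f0fu); /* bytes     */
        x = (x & 0x00ff00ffu) + ((x >> 8)  & 0x00ff00ffu); /* 16-bit    */
        x = (x & 0x0000ffffu) + ((x >> 16) & 0x0000ffffu); /* full word */
        return x;
    }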


context

I don't think it has to be explicit hate-of-the-job for burnout to occur. For me personally it was just lack of meaning in that job. I didn't, deep down, feel that it was very important, or the right thing for me to be doing. It was just the thing I was doing because I didn't have anything better to do.

context

I have run a company for 13 years and I don't have doubt about it at all. Dropping to 5 hours is for sure going to be less productive unless all your employees are terrible in the first place (in which case, just hire people who want to work more).

I am not claiming that the standard 8-hour day is the maximum; but I think if a shorter day is better, I would guess the situation would peak around 7 or 7.5 hours. But again this depends on what kind of people you are talking about. I personally work 60+ hours a week, most weeks, and I prefer it that way.


context

I had this same kind of personality / mental condition, and I am going to say, if he is really of the same personality type, 5-hour days are not going to help this author in the long term. What is helping his mood is not really the shorter day, but the hope of having made a short-term structural change that might fix things. The thing is, it won't. He already mentions at the end that burnouts are back. Well, pretty soon the 5-hour days will be feeling too long and he will be 'unable' to do them. Then what? 3-hour days?

The fundamental problem is that he doesn't actually want to be doing what he is doing, despite the rhetoric of "great team and awesome project". Come on, is that really how you feel about it deep in your heart, or is it empty SV rhetoric?

Two things will help this author:

(1) Strike out on your own, following your own motivation only. Yes you have to figure out how to make ends meet financially, but that is your lot in life. Fortunately it is easier to do this with computers than in most other fields.

(2) Meditate, learn to observe your mind and why it does what it does, so that you don't feel powerless or subservient to things like burnout. It's hard to explain the transformation that takes place, but being able to stand next to or outside these mental processes is very powerful.


context

It is simply not true unless you have a pathological idea of "hand-allocation" (which, to be fair, is how some programmers do program).

Let me put it this way ... all "garbage collection is fast" claims are saying the following thing:

"It is faster for the programmer to destroy information about his program's memory use (by not putting that information into the program), and to have the runtime system dynamically rediscover that information via a constantly-running global search and then use what it gleans to somehow be fast, than it is for the programmer to just exploit the information that he already knows."

It sure sounds like nonsense to me.
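
For example (a sketch): if you know a whole batch of allocations dies together -- per frame, per request -- an arena encodes that knowledge directly, and there is nothing left for a collector to go rediscover:

    #include <stddef.h>

    typedef struct {
        char  *base;
        size_t cap, used;
    } Arena;

    /* Bump allocation: a bounds check and a pointer add. */
    static void *arena_alloc(Arena *a, size_t n) {
        n = (n + 7) & ~(size_t)7;            /* 8-byte align */
        if (a->used + n > a->cap) return NULL;
        void *p = a->base + a->used;
        a->used += n;
        return p;
    }

    /* Everything allocated this frame dies here, in O(1).
       No global search required to figure out what is garbage. */
    static void arena_reset(Arena *a) { a->used = 0; }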


context

I work in cafes most of the time and seem to get a lot done.

context

If there's an accident (boom) then the paying customer's payload is destroyed and in most cases they will not have a duplicate payload just standing by to be launched. Building satellites takes a while.

context

Guys. Stay away. Tcl is awful if you want to write software that doesn't waste hours and hours and hours of your precious time hunting down simple bugs that would not have been an issue in most other languages.

- guy who used Tcl in like 1992 and helped write a 'compiler' for it, etc.


context

Yeah, identifying risk is important. But thinking one can "identify risk" as some random dude posting on Hacker News, criticizing a guy with a lot of experience running cutting-edge tech companies that make stuff like cars and spaceships (ACTUAL tech, not these lame web sites that people call "tech" these days), is ... well, delusional would be a polite word for it.

context

Dude, Photoshop takes many seconds to start up, and as of the most recent redesign it now often takes multiple seconds just to display the new project menu. And this is not atypical of today's software.

Forget 16ms, I would be happy to get to an order of magnitude slower than that for much of today's software ... it would be a massive increase in human happiness.


context

I think the sentiment still applies. Why is this a service instead of just a library I can link and use however?

context

I don't think "long-lived" is a good substitute for "extremely large". The longer something lives, the better it should be, for multiple reasons -- more time to work on the code, more design iterations, more-thorough understanding of the problem gained over time. If the code is just getting more messy and decayed and hard-to-deal-with over time, then we are doing something wrong. (And we almost always are).

I'm not just saying that people have lost track of the importance of efficiency. I am saying they've lost track of how to actually do it. I think at least 95% of the programmers working in Silicon Valley have no practical idea of how to make code run fast. Of the remaining 5%, a very small number are actually good at making code run fast. It's a certain thing that you either get or don't. (I didn't really get it when I started in games, even though I thought I did ... it took a while to really learn.)


context

But ... in a text editor you care about lines most of the time.

This is what I object to about the rope representation -- it intentionally destroys this information that you actually want available most of the time. I don't think that's nice at all.

It's possible you could make the rope work better in this sense by annotating each piece... I dunno, haven't thought about it.

As for the worst-case performance thing ... I think my scheme would do fine with super-long lines or 10 million line files. But dude, I don't even have an editor today that works okay on 10k-line files, and I don't think it's the internal data representation that's the problem, I think it's because of all the other decisions that get made (or lack thereof).


context

To get the parentheses right, you have to parse the language.

There is an extensive body of literature on parsing that goes back decades. Most of it I don't think is that useful. But some of it is about parallel parsing. If you are interested, there are quite a number of people with something to say about it. However, the speed wins in practice are not very big.

On the other hand, if you just write the parser so that it's fast to begin with, you don't really have a problem. The language I am working on parses 2.5 million lines of code per second on a laptop, and I have only spent a couple of hours working on parser speed. To do this it does go in parallel, but it goes parallel in the obvious way using ordinary data structures (1 input file at a time as a distinct parallel unit). So it's not "parallel parsing" in the algorithmic sense.


context

I am not trying to be anti-intellectual, but software currently has the opposite problem, where people decide some idea will Make Everything Better and it turns out that this idea does nothing of the kind. In fact some of these ideas have set software engineering back by decades (example: Object-Oriented Programming).

There are, of course, computer science concepts that are very smart. But we don't need these to save us from slow software, because today's slow software problem is just the result of people doing bad things in layer upon layer. We have to stop doing all the bad stuff and dig us out of the hole we're in, just to get back to neutral. Once we are back at neutral, then we can try thinking about some computer science smarty stuff to take us forward.


context

Oh you're that Raph! Hi.

Sorry for presuming age + experience level, it's how this came across to me. Actually I think Rush is a prime example of "excited about general ideas that turn out not to be right or relevant to much". But we were students, and I guess that is what students often do.

I agree modern editors are too slow and bloated. I would write one if I didn't have way too many other things happening. But I don't think they are slow and bloated due to a lack of computer science concepts. I think they are slow because most of the world, over the last 25 years, has lost the art of writing software that is remotely efficient.

If I were to write an editor, it would store text as arrays of lines (since lines are what you care about) with maybe one level of hierarchy, such that each 10k lines of the file are in one array. I think that would be fine and if it ran into problems with very large files, relatively minor modifications would take it the rest of the way. (Of course this is untested but I feel pretty confident about it). Rather than calling malloc all the time, a specialized allocator would be in play.
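
A sketch of that layout (types and names illustrative):

    #include <stddef.h>

    /* One line of text. */
    typedef struct { char *text; size_t len, cap; } Line;

    /* One chunk holds up to ~10k lines, so inserts/deletes only have to
       shuffle within a chunk, never the whole file. */
    enum { LINES_PER_CHUNK = 10000 };
    typedef struct { Line *lines; size_t count; } Chunk;

    /* The single level of hierarchy: a short array of chunks. */
    typedef struct { Chunk *chunks; size_t chunk_count; } Buffer;

    /* Line lookup: walk the (tiny) chunk list, then index directly. */
    static Line *buffer_line(Buffer *b, size_t n) {
        for (size_t i = 0; i < b->chunk_count; i++) {
            if (n < b->chunks[i].count) return &b->chunks[i].lines[n];
            n -= b->chunks[i].count;
        }
        return NULL;  /* past end of file */
    }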

I do think it's a good idea to make a better editor so I wish you good luck with that (dude I am so sick of emacs).


context

Please read this with a grain of salt as it does not seem practical or necessary. It seems like the kind of thing written by a young person who is excited but doesn't really have much experience. Most of the ideas would not be real-world-useful as stated.

Excitement is nice to feel, but it takes some experience to know when excitement is really aimed in a productive direction. Otherwise we end up with the kind of motivation that so often produces over-complex and mis-aimed software: having a "cool idea" for "exciting technology" and then looking for places to apply it, and the applications don't really fit or don't really work, but we don't want to notice that, so we don't.

To pull examples: an entire one of these essays is on "paren matching" and how it would be really great if you monoidized (ugh) and parallelized that ... the basic idea of which is instantly shot down by the fact that language grammars are just more complicated than counting individual characters. Hey bro, what if there is a big comment in the middle of your file that has some parens in it? The author didn't even think of this, and relegates this to a comment at the end of that particular essay: "Jonathan Tomer pointed out that real parsing is much more interesting than just paren matching." Which is a short way of saying "this entire essay is not going to work so you probably shouldn't read it, but I won't tell you that until the bottom of the page, and even then I will only slyly allude to that fact." Which in itself is contemptuous of the reader -- it is the kind of thing that happens when you are excited enough about your ideas that the question of whether they are correct is eclipsed. This leads to bad work.
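(The failure case takes two lines of C to construct. A character-level counter sees four open parens here; a real parser sees zero:)

    int x = 1;          /* this comment contains an unmatched paren ( */
    char *s = "(((";    /* string literals break the count, too */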

There's the essay about the scrollbar -- if you have a 100k-line text file, do you really want one very long line somewhere in the middle to cause the scrollbar to be narrow and tweaky in the shorter, well-behaved majority of the file? No, you probably don't! But this observation shoots down the idea that you might want to do a big parallel thing to figure out line length, so he declines to think about it. In reality what you probably want is the scrollbar to be sized based on a smooth sliding window that is slightly bigger than what appears on the screen (but not too much).

Besides which, computers are SO FAST that if you just program them in a straightforward way, and don't do any of the modern software engineering stuff that makes programs slow, then your editor is going to react instantly for all reasonable editing tasks.

I don't want to be overly critical and negative -- these sorts of thoughts are fine if they are your private notes and you are thinking about technical problems and asking friends for feedback. It becomes different when you post them to Hacker News and/or the rest of the internet, because this contains an implicit claim that these are worth many readers' time. But in order to be worth many readers' time, much more thought would have had to go in ... and as a result, the ideas would have changed substantially from what they are now.

I didn't read past essay 4, so if it gets more applicable to reality after that I don't know!


context

It was the "or indeed in C" that ended it for me right there (though that was compounded by the 'you want to load your 8-dimensional points from a file, where else could you possibly be getting them?' stuff, which shows that the correspondent has not done any scientific or geometric programming, or computer graphics, or video games, which are some of the main fields where performance matters most, so if someone is going to make a performance argument ... maybe he should actually know about performance programming).

If you want to talk about Haskell, fine ... I don't know anything about Haskell, though, and I am interested in high-performance programming, which is an area where Haskell cannot currently play (nor can any GC'd language). Making claims about how the performance of an operation in a slow language doesn't get any slower under certain circumstances isn't that interesting to me.


context

It's very sad. Halo was originally announced at MacWorld:

https://www.youtube.com/watch?v=6eZ2yvWl9nQ

but history did not go that way....


context

"Then you get early warning that you need to be paying attention, rather than silently corrupting your data by swapping some coordinates around."

No, a real-world case is that I have a giant program that uses my own geometric primitives, and now I want to start heavily using a library. I know they are just fricking 3D points or quaternions or whatever. Yet because of some weird ideology you want to increase the amount of gruntwork I have to do, and make my life much less pleasant.

"You don't want an 8-dimensional point literal that takes 8 arguments directly, that's never going to be readable. You might want to use a builder. More likely you want to load it from a data file or something on those lines rather than constructing it directly. Where are you even getting these 8-dimensional points from?"

WHAT ARE YOU TALKING ABOUT

"In Haskell (or indeed in C) it doesn't necessarily have performance implications; an 8-element structure may have exactly the same runtime representation as those 8 elements being passed distinctly."

Wow, okay, this conversation is over.


context

(a) This kind of thing in many cases is just tedious busywork that you are now making someone do every time they call the routine. What if they have a MyPoint and your procedure takes a YourPoint? And what if your point is 8-dimensional, anyway, what does that constructor look like?

(b) This has performance implications, not least because of the ABI. And depending on what language you are using, they can be quite severe (good luck if you are using one of those languages that always puts classes on the heap).


context

no more than 3 arguments for your functions

It is hard for me to believe that this is a piece of style advice that anyone writing serious software would follow.

If you need to do something wherein the basic task requires 8 arguments' worth of information (which happens A LOT) then trying to factor that into 3-argument pieces is going to give you something Byzantine that is probably also buggy (and it could get extremely heinous, in a way determined by the data dependencies internal to the procedure you are factoring). And if you somehow succeed at all this, congratulations, you just did a bunch of engineering that did not improve the functionality of your software in any way. (In fact it probably made the software take longer to compile).

If doing a certain job needs 8 pieces of information, it needs 8 pieces of information. It doesn't help anyone to try to break that up.

Similarly with this:

keep the complexity of the functions as low as possible

Not really. If you are just factoring some block of complexity into 4 blocks of less-complexity, well, now you have the same amount of complexity as the original code, plus the complexity of the call graph, and the fact that the person who comes along to read the code will not be able to clearly see the control flow.

There definitely are many cases when factoring a procedure into simpler things is beneficial. But to claim that it's a good idea all the time, or even half the time, is I think mistaken.


context

No, because they share a base Entity class.

context

The way I do it, each entity has one struct. So a Bullet is one struct and a Door is one struct. They both share common base members that all entities share.
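(A hedged sketch of that shape -- field names invented for illustration, not code from any shipped game:)

    // Common members shared by every entity go in one base struct.
    struct Entity {
        int   id;
        float position[3];
        float orientation[4];
    };

    // Each entity type is one plain struct, with the base as its first member.
    struct Bullet { struct Entity base; float velocity[3]; int damage; };
    struct Door   { struct Entity base; float open_amount; int locked;  };

    // Procedures that only need the shared members take an Entity pointer;
    // since the base is the first member, &bullet.base works for any entity.
    void entity_move(struct Entity *e, float dx, float dy, float dz);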

But this is all that is really necessary. Once you start getting into components, you add a lot of complexity (even if the pitch is that it's "simple").


context

"I'm curious what API you would use for implementing a table with varying row heights (that you only know upon rendering but can guess beforehand), sortable columns and millions of rows."

In general my policy is that when things get really complicated or specialized, the application knows a lot more about its use case than some trying-to-be-general API does, so it makes sense for the application to do most of the work of dealing with the row heights or whatever. (It's hard for me to answer more concretely since it depends on exactly what is being implemented, which I don't know.)


context

The reason this explanation is so widespread is that it is actually not false. However, the actual uncertainty in the Uncertainty Principle takes place at a much deeper existential level, in a more-important way, so this is a woefully incomplete explanation.

The actual uncertainty comes from the fact that the two quantities are Fourier transforms of each other... and just by that relationship, inherently, if one gets very localized (== very high frequency bump in its space), its Fourier dual gets spread out very far through space. (You can sort-of analogize this if you know about audio ... a sharp spike in temporal space, when transformed into frequency space, becomes a very big spread of values, because all those frequencies are relatively blunt and they have to somehow fit together to make this sharp thing, which requires an enormous number of them. Or if you go the other way, a sharp spike in frequency space means one frequency, which transforms into an infinitely-long sine wave in temporal space. So think about that kind of thing, except instead these are waveforms where the y value is kind-of the probability of getting that particular x-value as a result if you perform a measurement.)
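(For reference, the textbook statement of this duality: the position-space and momentum-space wavefunctions are a Fourier transform pair, and their spreads trade off against each other:)

    \hat\psi(p) = \frac{1}{\sqrt{2\pi\hbar}} \int_{-\infty}^{\infty} \psi(x)\, e^{-ipx/\hbar}\, dx,
    \qquad
    \sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}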


context

If you want to successfully build a game, I recommend staying away from entity/component or any other overcomplex system ... until such time as you are an expert game programmer and know that's what you really want (such a time is likely never to come, for one reason or another).

I have worked on games and engines using several different systems, and the only ones I ever enjoyed treat entities as plain regular structs that you operate on with procedures.

The games I have shipped all treat entities this way. And I never thought "I wish I had made the entity system more complex". The top N problems on our list, where N > 10, are always about graphics drivers or APIs.

Entities are hard enough when they are just structs. Don't insist on making them harder, or you are likely to shoot yourself in the foot when it comes to performance issues, later.


context

Most jobs are not in sales, so regardless of whether this is true, it only applies to a minority of positions. (Though I have no numbers, in my experience sales is the one area where there are lots of women in tech companies, as well as in some other industries like pharmaceuticals.)

I don't know that you can blame anything on "the fundamentally antagonistic world" since all businesses face a fundamentally antagonistic world and their goal is to overcome that.


context

In fact this is a reason to be skeptical of this kind of discrimination argument.

If the wage gap is happening and is really so drastic, if women are being undervalued so hard, etc, then there should be a massive Moneyball-style opportunity for people to start companies that correct this error. With the advantages you'd gain by adjusting hiring, you'd completely trounce the competition.

This hasn't happened yet though. Either people are being slow to do it, or the wage situation is not as straightforward as it is being put in these arguments.


context

You are talking to someone who has done 3D rendering professionally for 21 years. What's your background?

context

Yes, absolutely, and in fact I think it would be a much better program.

context

"so you need some data structure that sticks around between frames specific to the widget type, so that's what retained mode APIs like Qt do for their widgets."

Immediate mode GUI systems are allowed to keep state around between frames and the most-featureful ones do. The "immediate mode" is just about the API between the library and the user, not about what the library is allowed to do behind the scenes. The argument that retained-mode systems are inherently better at this doesn't hold water; it is kind of an orthogonal issue.


context

It is a little confusing because we are talking about both rendering and GUIs, but ... "retained mode" in this case refers to the GUI itself, not the method of drawing. Motif and Xlib are "retained mode" in the GUI sense because if you want there to be a button, you instantiate that button and register it with the library, and then if you want it to become visible or invisible or change color you call procedures that poke values on that instantiated button. In IMGUI you don't preinstantiate; you just say "draw a button now" and if you don't want it to be visible, you just don't draw it, etc.
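(A sketch of the difference in code -- every API name here is invented for illustration:)

    typedef struct Button Button;
    extern Button *create_button(void *window, const char *label);
    extern void    set_button_visible(Button *b, int visible);
    extern int     do_button(void *ui, const char *label);  // true if clicked this frame
    extern void    save_document(void);                     // hypothetical click handler

    void retained_style(void *window, int show_save) {
        static Button *save = NULL;            // instantiate once, then poke state
        if (!save) save = create_button(window, "Save");
        set_button_visible(save, show_save);
    }

    void immediate_style(void *ui, int show_save) {
        // The button exists this frame because you drew it this frame.
        if (show_save && do_button(ui, "Save"))
            save_document();
    }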

context

[citation needed, in the form of actual benchmarks]

The thing is that list fusion and whatnot is all just there to get around the handicap that was placed there in the first place by the language paradigm. So you start by insisting on shooting yourself in the foot, then put lots of armor on your boot so the bullet hopefully bounces off.

I assume by "vectors" you mean arrays ... there is no case in which this can be faster than arrays, because in the limit, if the list fusion system works perfectly, it is just making an array. A thing can't be faster than itself.


context

I am assuming that your caret bar may be overlapping text in some way, or that there is a background bitmap that you might be alpha-blending against, etc. Basically I don't want to make an assumption that might break if the UI gets nicer. The case of a strictly opaque strictly rectangular non-antialiased non-smoothly-moving bar does not seem very interesting or nice-looking.

context

I will also say that this is not an academic argument for me; I am in the middle of writing yet another immediate-mode GUI right now, for the game editor I am working on. Every day I am freshly glad that I am doing things as IMGUI instead of RMGUI.

Here is a (somewhat old) video explaining some of the motivations behind structuring things as IMGUI: https://www.youtube.com/watch?v=Z1qyvQsjK5Y


context

Ehh, game engines are not really retained-mode in the way you mean. There isn't usually a cordoned-off piece of state that represents visuals only. Rather, much of that state is produced each frame from the mixture of state that serves all purposes (collision detection, game event logic, etc).

"What happens if you try to present an immediate mode API for UIs is the status quo with APIs like Skia-GL."

I don't know what Skia-GL is, but in games, the more experienced people tend to use immediate-mode for UIs. (This trend has a name, "IMGUI". I say 'more-experienced people' because less-experienced people will do it just by copying some API that already exists, and these tend to be retained-mode because that is how UIs are usually done). UIs are tremendously less painful when done as IMGUI, and they are also faster; at least, this is my experience. [There is another case when people use retained-mode stuff, and that's when they are using some system where content people build a UI in Flash or something and they want to repro that in the game engine; thus the UI is fundamentally retained-mode in nature. I am not a super-big fan of this approach but it does happen.]

"and you draw strictly in back to front order so you completely lose your Z-buffer"

That sounds more like a limitation of the way the library is programmed than anything to do with retained or immediate mode. There may also be some confusion about causation here. (Keep in mind that Z buffers aren't useful in the regular way if translucency is happening, so if a UI system wants to support translucency in the general case, that alone is a reason why it might go painter's algorithm, regardless of whether it's retained or immediate).

"But that's the API that these '90s style UI libraries force you into."

90s-style UI libraries are stuff like Motif and Xlib and MFC ... all retained mode!

I don't agree that an IMGUI style forces you into any more shader switches than you already would have. It just requires you to be motivated to avoid shader switches. You could say that it mildly or moderately encourages you to have more shader switches, and I would not necessarily disagree. That said, UI rendering is usually such a light workload compared to general game rendering that we don't worry too much about its efficiency -- which is another reason why game people are so flabbergasted by the modern slowness of 2D applications; they are doing almost no work, in principle.

Back to the retained versus IMGUI point ... If anything, there is great potential for the retained mode version to be slower, since it will usually be navigating a tree of cache-unfriendly heap-allocated nodes many times in order to draw stuff, whereas the IMGUI version is generating data as needed so it is much easier to avoid such CPU-bottlenecking operations.


context

It seems reasonable this might be true, but it's not. In video games we went down the road of retained-mode graphics APIs (declarative-type things, so that they can do the kinds of 'global optimization' you mention) but we abandoned them because they are terrible. Video games all render using immediate-mode APIs and this has been true for a very long time now and nobody is interested in going back to the awful retained-mode experiment.

context

None of the 3 things you said are true. I recommend you get some experience in rendering before you mislead people too much with these kinds of comments.

In reality the problem is trivial, you set up a scissor rect (or explicitly mask the pixels in your shader) and then render only stuff overlapping that square. You don't need to invert the pixels for it to be fast; you can render an arbitrarily nice cursor effect.
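(For example, in plain OpenGL the clipping part is a couple of calls -- a sketch only, with the render loop and the cursor drawing itself elided:)

    #include <GL/gl.h>

    extern void draw_cursor_effect(void);  // hypothetical: whatever fancy cursor you like

    void redraw_cursor_region(int x, int y, int w, int h) {
        glEnable(GL_SCISSOR_TEST);
        glScissor(x, y, w, h);     // pixels outside this rect are left untouched
        draw_cursor_effect();
        glDisable(GL_SCISSOR_TEST);
    }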


context

"With the GPU the only viable option for rendering blinking caret is to redraw the whole window."

Sorry, that is plainly false. There is nothing preventing you from treating an offscreen buffer just like any other buffer of non-dirty pixels. Treating the back buffer that way is slightly less conventional but is still just fine.


context

Having done TM as well as Vipassana (discussed in the original article) and other meditations, I do not recommend TM. You will do just fine (or better) with other things. TM happens to have some famous exponents (e.g. David Lynch) but I find it a somewhat cult-like money extraction machine, and the actual meditation they teach to be less worthwhile than the other flavors of meditation in which I have experience.

context

"it is very easy" but have you ever actually measured whether it does, and whether the resulting performance is high?

context

If you are solving an easy enough problem, then sure, you can use a crappy technical foundation. You can use anything if the problem is easy enough.

The point of engineering is to solve actual hard problems.


context

What I said is true for almost every kind of game. POV of the camera has nothing to do with it.

The main exception is low-number-of-player token-ring style games like RTSs with tons of units. Those usually simulate in lockstep, with the full state of the world extrapolated from inputs that consist of a very small amount of data. This means network traffic is relatively low, but in order for this to work you have to have complete knowledge of everything and exactly when it happened, which means no packet loss can be accepted and everything must be processed in order. So then you have the same kinds of problems as with TCP (even if the underlying transmission is via some other protocol) ... thus these games operate with some large amount of latency to hide these problems.

But, this network design is only the case for a minority of games. Just about any modern multiplayer game that is drop-in/drop-out, where the developers really care about quality of experience, is better off going UDP. (This is not to say that developers always do the best thing, since it's much easier to just say screw it and talk over TCP and call it a day. The temptation to do this is heightened because of all kinds of problems with NAT punchthrough and whatnot; because so much traffic is Web-oriented these days lots of routers mainly care about that, which causes all kinds of interesting annoyances. Thus games that do talk over UDP generally fall back to TCP if they are unable to initiate a UDP connection).

Well, there is one other case of games that run in lockstep, which is when they are console games made by developers who want to avoid incurring the costs of running servers (which are often much higher on consoles because the platform holder charges you out the nose). When you are running in lockstep like that it is more like the RTS scenario above, and thus it doesn't matter much if you use TCP because you are already taking the quality hit. But this is a cost-cutting kind of decision, not an it's-best-for-gameplay kind of decision.

P.S. It's not a good idea to call someone naive about a subject where you yourself may not know enough to correctly judge naivete.


context

UDP provides massively better quality of service for realtime data wherein there aren't usually dependencies between particular pieces of data.

For example, if you are transmitting the position of some guy in a world, N times per second ... and you drop one particular packet ... that's fine, you just get the next one and you have more up-to-date information anyway.

TCP will block the entire connection when that packet is dropped, waiting until it is received again, and not giving any of the subsequent information to the application. This is bad in THREE different ways: (1) By the time the new position is received, it is old and we don't care about it any more anyway; (2) Subsequent position data was delayed waiting on that retransmit and now most of that data is junk too, EVEN THOUGH WE ACTUALLY RECEIVED IT IN TIME AND COULD HAVE ACTED ON IT, but nobody told the application; (3) Other data on the stream that had nothing to do with the position was similarly delayed and is now mostly junk too (for example, positions of other guys in totally other places in the world).
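(A minimal sketch of the receive side of that UDP pattern -- names and packet layout are hypothetical, just to make the idea concrete:)

    #include <stdint.h>
    #include <string.h>

    struct PositionPacket {
        uint32_t sequence;     // incremented by the sender on every send
        uint32_t object_id;
        float    position[3];
    };

    struct Object { uint32_t last_sequence; float position[3]; };

    // Newer data supersedes older; late or duplicate packets are just dropped,
    // and nothing ever blocks waiting for a retransmit.
    void handle_position(struct Object *obj, const struct PositionPacket *p) {
        if (p->sequence <= obj->last_sequence) return;   // stale: ignore it
        obj->last_sequence = p->sequence;
        memcpy(obj->position, p->position, sizeof obj->position);
    }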

It is hard to overstate how bad TCP is for this kind of application.


context

SpaceX has 5000 employees and they design, build, launch, and LAND rockets.

context

Is your test mostly waiting on a mechanical hard drive? If so, then "2.5x slower than cp" could mean "a very large amount slower than cp" once you remove that overhead.

Whereas I have not done your specific test, I know that for the file sizes of executables I deal with in everyday work (around 10MB), the amount of time I wait for linking is woefully disproportionate.


context

Windows has incremental linking and it's still too slow. But it's faster than non-incremental linking, yeah.

context

Linkers are mind-bogglingly slow. I don't understand why they are so slow.

lld is still slow, it is just less slow than the other linkers.

This is not to disparage anyone working on linkers or say they are not smart. I think they just don't tend to be performance-oriented programmers, and culturally there has become some kind of ingrained acceptance of how much time it is okay for a linker to take.


context

10 years ago, the rhetoric was "electric cars will never work, because they won't ever be able to go 200 miles and there's no infrastructure, and they drive like golf carts, ha ha!".

Now it's "well it doesn't go 500 miles and it doesn't charge in 5 minutes, and okay, it is the quickest to 60mph of any production car in the world, but only on a full charge and that sucks!!!"

EVs keep getting better. Another 10 years from now, what will anti-EV people be saying?


context

Accusing someone of "virtue signalling" instantly ends any possibility for reasonable discussion, because it's a stark violation of the Principle of Charity. https://en.wikipedia.org/wiki/Principle_of_charity

On a meta-level, invoking "virtue signalling" is itself a signal that you want to ad-hominem the human subject rather than discuss and resolve the issues.

It is the opposite of rationality, and it's a bit weird that so many People Who Think They Are Rational are into the idea.


context

It's not specific questions that are important so much as the process. You want the questions to be tailored to the candidate.

See this example: https://www.youtube.com/watch?v=cfyWvJdsDRI


context

Hahaha, I guess it's a big misunderstanding. You wrote:

"Nobody really wrote software rendering like that beyond CG classes".

I read this as a claim that nobody in general wrote software renderers, when by "like that" you just meant using the specific techniques he used.

That said, I still have to disagree, in the sense that, to get to a fast software renderer, you start with a slow software renderer. Nobody does all the crazy optimizations a priori ... so stuff like a divide per pixel was common, say. Calling trig functions in inner loops is of course goofy, but my presumption is that in the next step of refinement those would be lifted out of the loops, because that is the way things are always done.


context

Wrong, wrong, wrong, wrong. Wrong.

People in the video game industry wrote tons of this stuff. We would spend weeks figuring out how to get one or two instructions out of the rasterizer or scanline converter, etc. I know this because I was there. I wrote several software rasterizers, and I learned how to do it by reading papers and magazine articles written by other people who wrote software rasterizers.

I have no doubt that other industries did so as well.

Even more recently, companies like RAD Game Tools built as products software rasterizers that are very fast (e.g. Pixomatic).

Also, what's in this article is a simplified introductory take. It is actually much much more complicated than this. (It doesn't look to me like he is doing perspective-correct shading, for example.) Also this guy's code is crazy slow compared to what you'd write in the real world, but hey, it is a tutorial.


context

But it's DATA SCIENCE. You know it's SCIENCE because they called it SCIENCE.

context

Sure it will involve time investment on my part, but I want a visualization tool to help lighten that investment and help me come to the understanding more quickly.

context

If you're going to say that, then hand-coded memory allocation is just an implementation detail for a garbage collection algorithm.

True in some sense, but mostly useless. Come on.


context

I am very interested in program visualization, but I think trying to visualize things at this level is just nutballs.

Anyway, it's the wrong problem. I don't need help understanding x * 3 + y. I need help understanding what these 30kLOC in these 17 files do.


context

This feels to me like moving the goalposts.

But come on, I have used emacs for 25 years, and on a daily basis it stalls for annoying amounts of time while I am just doing something simple like editing a .cpp file. Today. In the year 2017.


context

I agree ... it doesn't make any sense to me that an employer should have anything to do with supplying healthcare to employees. It just makes the system more complicated. I'd be happy with simplifying that out.

But this doesn't seem to be nearly enough to tip the balance in terms of operating a business overall ... I consistently hear from people how much it sucks to operate a small business in Europe. I don't see what's wrong in principle with having worker protections kick in at a certain company size, but that doesn't seem to be popular.


context

I know several people who have emigrated from some of the above countries to the USA, because running a small business is extremely painful (see for example France).

If you think about companies as "us versus them", where "us" is workers and "them" is giant faceless monolithic corporations, then your idea that enforcing worker protections is a high priority might make sense.

But most companies are small. I run a software company ... I am just a guy trying to get by, who now in addition to the normal-person's burden of making my life go, has to also make a company go, and that company provides jobs for 10-12 people.

If you make my situation much harder than it is, the company would cease to exist or would downscale to 2-4 people, shedding the majority of the jobs. I am not a faceless corporation, I am just a guy who wants to get interesting things built. My little company is certainly not set up to "exploit workers", especially not on an industrial scale.

Now, paradoxically, if you add a lot more friction to what needs to happen to run a business (regulation around hiring, firing, invoicing, etc), then people like me drop out, and then what you mostly have left is the larger companies who do want to exploit workers because that is just kind of how larger companies work. Plus then you lose all the innovation / energy / economic activity that comes from smaller companies. It maybe seems like not the best idea. (If it is, how come Silicon Valley is not in France?)

As an investor, I fund a small French company and I have seen some of the crap they have to deal with just because they have a handful of employees. It makes me very glad I don't live in France.


context

Definitions of class boundaries vary slightly depending on who you're talking to, but in general, "upper class" means you do not have to work for a living, plus you are plugged into the social network of upper-class people.

This is true for a very small percentage of the population.


context

When I was in college in the early 90s, unix systems were nonstop-paging quite often, and lisp systems would frequently pause while you were interacting with them, for nontrivial amounts of time, like multiple seconds.

And that was the 90s, after Lisp had been around for decades...


context

Usually I have upgraded, but I currently have no intention of buying an iPhone 7 or later, unless they fix something!

For me it's a double whammy. I don't like the lack of headphone jack, and whereas I feel like I could manage grumpily ... for me it kills the excitement of buying the new device, and I think that is important. (My standard listening headphones are Etymotic ER-4Ps; there is no way I am going to downgrade to AirPods).

But the bigger part of the whammy is iOS. iOS is completely terrible at this point. I just can't consistently control the phone. A large percentage of taps or swipes do things I did not intend (how many I'm not sure -- 20%? 33%?). It's just completely crazy. They need to get rid of 3D touch, get rid of the double-tap one-handed accessibility mode or whatever it's called, get rid of weird swipes from the edge, and fix the horrible inconsistencies in the way autocorrect works. (Or, please, offer a system that just underlines words-thought-to-be-wrong without changing them, and let me tap on them to change them; or use the current autocorrect system but let me tap on a word to un-"correct" it. The fact that the current system just changes what I typed and gives me no recourse to fix it, apart from laborious deletions and re-typings, which I often have to do 2 or 3 times, is just haughty and offensive.) And in the meantime, might as well redesign the rest of the UI. Because right now the phone is not a joy to use, it's a constant exercise in frustration. I haven't felt good about using iOS since sometime back around iOS 5 or 6.

So it's no mystery to me why sales might be slowing ... I don't want a new one if it's going to continue the downward trend.


context

GC enthusiasts say this, but I have never seen it actually be true.

The reason is that whoever wrote the GC for your language has to solve an extremely general problem for an extremely large body of users with very different use-cases.

A memory management system for a particular program only has to solve the problems of that program, which is a tremendously simpler thing to do.

General-purpose GCs are like the F-35 or Space Shuttle ... due to the broad nature of demands they are very complicated, and are much more expensive and perform more poorly compared to specific solutions.


context

I am a programmer with a lot of experience who has written a lot of things from scratch (not in Haskell).

I think most documentation sucks and I dislike trying to read it, because it's very hard to get a picture of what's going on, and what these procedures / data structures / etc are really for.

I am much happier when I can just look at a straightforward and clear example, and then just use the documentation to look up specifics of how things work after I already get the basic idea.

It's not because I "need worked examples of things to understand them", it's because that is the way I like to work, because I have had many instances in my life of trying to make sense out of documentation that seems to have been written from a mindset of "formal writing involves not actually telling the reader what things are really for, straightforwardly". I don't know why that disease is so common, but almost all documentation is like that.


context

This class of problems falls into a weird space for me. They are 'toy problems' but I don't feel like I'd have fun doing them.

I had a lot of fun playing Shenzhen I/O and TIS-100, so if you want to do some programming challenges, I'd recommend trying those.


context

No, those are only seen as "ways to solve the problem" by the functional-style programmers who created the problem in the first place.

Stack allocation is what your computer is built from the ground up to do. It is not some kind of workaround or optimization, it is how software was originally designed to work.


context

I think at this point you are abusing terminology such that it's meaningless.

I am not just talking about putting copies of things on the stack, but having few copies to begin with, etc.


context

Sure, but it's also worth pointing out that the class of programs that behave this way is the class of programs that are inefficient to begin with.

If you handle strings, and you are copying and freeing strings all the time, that's just slow code.

And it's not the case that GC makes the slow code faster... it's that certain techniques enable GC to not fail catastrophically on slow code.

It's a little bit disturbing that folks are ready to extrapolate this to some kind of universal rule.

In a high-end video game, for example, we hardly allocate anything short-term at runtime. The whole program is architected to avoid it. When we allocate, it tends to be things with medium-to-long-term life (texture maps, sound effects, render targets, whatever) so a generational system is useless there. In fact we don't use GC on these kinds of things at all, because we also need to control exactly when they are deallocated so we can put something in their place, because memory is limited.


context

All the Hacker News kids are crazy about microservices right now.

In three years, people will have realized it wasn't such a good idea, and moved on to the next "this will solve everything" fad.

Once you have seen enough of these fads go by, it's pretty obvious.

I'd be careful about extrapolating future directions of computer science from something that seems cool this year.


context

Yeah. I find it a haughty article because he's not acknowledging the elephant in the room, which is that GC is not a solved problem despite all the whiz-bang techniques ... and that the Go people aren't being dishonest or ignorant of these techniques, but rather, judging those techniques not to be so useful for the class of programs they care about.

context

Your figures are way wrong.

I have an original Tesla Roadster, bought in 2010, with a battery that is basically the first thing they figured out how to do in order to put a car together. (The Model S battery is much more advanced). I drove the Roadster daily for 6 years, and I had about 12% capacity loss after those 6 years. This was a much better situation than Tesla projected (I don't remember what they said at the time, but it was something like 30-40% loss at 7 years, and for a relatively low price they sold an optional battery replacement plan that kicks in at 7 years).

Supposedly the Model S's chemistry is much, much better. Just saying "they're lithium batteries" is kind of a red herring, because there are many many subclasses of lithium battery, and at least according to Musk the fact of lithium is not nearly the most important part, but what really matters is the composition of the cathode and anode: https://chargedevs.com/features/tesla-tweaks-its-battery-che...

[Edit: And the theory that they would have preemptively hobbled the car's maximum range by (.85^6) is just crazy, because it means they could instead have advertised a car that had THREE TIMES THE RANGE on its initial launch, and "range anxiety" was one of the biggest issues they had to overcome. They could have said OUR CAR GOES SIX HUNDRED MILES ON ONE CHARGE, which would be way more important than hiding some degradation.]


context

Spoken like someone who does not ship fast software!

context

Which is why high-end games do not use generic system malloc; in general we link custom allocators whose source code we control and that are going to behave similarly on all target platforms.

(In fact we go out of our way to not do malloc-like things in quantity unless we really have to, because the general idea of heap allocation is slow to begin with.)
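(The simplest version of such an allocator is an arena. A sketch -- a real one would handle out-of-memory and per-type alignment more carefully:)

    #include <stddef.h>

    struct Arena { char *base; size_t used, capacity; };

    // Allocation is a pointer bump; there is no per-allocation free at all.
    void *arena_alloc(struct Arena *a, size_t size) {
        size = (size + 15) & ~(size_t)15;            // keep 16-byte alignment
        if (a->used + size > a->capacity) return NULL;
        void *p = a->base + a->used;
        a->used += size;
        return p;
    }

    // Everything is released at once, e.g. at the end of a frame.
    void arena_reset(struct Arena *a) { a->used = 0; }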


context

I don't understand your reply. How is this not a non sequitur?

context

Exactly. I would avoid using calloc simply because I don't know what it actually does.

context

If you don't know whether or not you are really getting an optimization, then how much do you really care?

If you really care, then you actually profile your system and see what takes how much time, under which circumstances. The results of such a profile are almost always surprising.

I guess this is a basic cultural difference -- almost nobody in the HN crowd really cares whether their software runs quickly; there is just a bunch of lip service and wanting-to-feel-warm-fuzzies, with very little actual work.

In video games (for example) we need to hit the frame deadline or else there is a very clear and drastic loss in quality. This makes this kind of issue a lot more real to us. If you look at the kinds of things we do to make sure we run quickly ... they are of a wholly different character than "guess that calloc is going to do copy-on-write maybe."


context

Yes, and that is exactly my point.

The article says you should use calloc because it provides these optimizations. I am saying no, that's goofy, because it is not specced to provide these optimizations.


context

Sorry, but this is just goofy and bad.

If you depend on copy-on-write functionality, then you need to use an API that is specced to guarantee copy-on-write functionality. If that means you use an #ifdef per platform and do OS-specific stuff, then that is what you do.
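(For instance, if what you actually want is zeroed copy-on-write pages, ask each OS for exactly that -- a sketch, with error handling elided:)

    #include <stddef.h>
    #ifdef _WIN32
    #include <windows.h>
    void *alloc_zeroed_pages(size_t size) {
        // Windows guarantees newly committed pages are zero-initialized.
        return VirtualAlloc(NULL, size, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
    }
    #else
    #include <sys/mman.h>
    void *alloc_zeroed_pages(size_t size) {
        // POSIX anonymous private mappings are zero-fill, copy-on-write.
        void *p = mmap(NULL, size, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        return (p == MAP_FAILED) ? NULL : p;
    }
    #endif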

Anything else is amateur hour.

If copy-on-write is a desirable feature, then as the API creator, your job is to expose this functionality in the clearest and simplest way possible, not to hack it in obscurely via the implementation details of some random routine. (And then surprise people who didn't expect copy-on-write with the associated performance penalties.)

This is why we can't have nice things.


context

Zaretskii's stance is weird. If you are going to run out of people who can work on the core of the editor's source code, then the editor will die. So the lack of ability to work on the code is the real problem. This is probably because it has accreted way too much complexity at this point, and way too many hacks. Shedding some of those hacks is a very good idea.

If you wanted to keep it the old way, and depend on the nuances of how an allocator stores memory, then ship your own allocator. Video game people do this as a matter of course; it's not a big deal.


context

If it were through willingness of the governed, then individual members of the governed who did not agree would be able to opt out ... ??

context

I think int is 8 bytes on the PS4. (I just got bitten by this...)

context

The way socialism puts people before money is via government control of resource allocation. That is by necessity authoritarian (even in the USA).

context

He's not assuming anything. He's saying garbage in, garbage out. How can there be a reasonable discussion if we don't even know what the thing is we are trying to discuss?

context

You are saying this free will stuff because you are conditioned to believe in free will. Someone who knew that there was only one future would find the idea of free will to be like some kind of fairy tale -- the lack of it doesn't require any more explanation than the lack of Santa Claus.

In our current non-fictional universe, where we don't know if there is free will, then if you want to believe in it, maybe the burden of proof is on you to explain how it would even be possible given what we know of the universe.

So I am not sure why you think a fiction author is failing at his job by not explaining the mechanism by which there is not free will.


context

I said shipping code, by which I specifically mean that which is left after all deletions (of which there are many).

context

> the architecting and consideration of interplay between components takes the bulk of the time.

Which is supposed to be what is simplified as LOC goes down.

So if a supposed 5x-10x code reduction (which I've never seen real evidence of) doesn't lead to 5x-10x productivity increase, how much increase is there supposed to be? Surely more than zero?


context

> would often take 10+ LoC in other languages can be succinctly expressed as 1-2 lines.

Every time I have heard this kind of claim (with modern languages), it turned out not to be true except for trivial code or straw-man bad code in the 'bigger' language. So if you have real-world examples that have real-world effort put in, I'd like to see them! (I would be happy to be wrong.)

5x-10x productivity increase would be huge if it actually existed; it would be so unstoppable that everyone would switch to the new really-great language immediately. That hasn't happened, which should be a clue that maybe the increase is not there.

Even a 20% decrease in cost of engineering would be so large as to be unignorable.


context

10,000 lines is supposed to be 'big'? wtf?

I write something like 25kLoC/year (of shipping code, generally very complex stuff) and I don't even program full-time. The two projects I am working on now are 35kloc (the smaller one) and 250kloc (the medium-sized one).

If someone thinks 10kloc is big, I have a hard time thinking of that person as a professional programmer.

(Numbers listed here exclude blank lines and comments.)


context

All the studies mentioned in this article are exactly the type that are currently being debunked en masse and generally causing a crisis of faith in psychology.

context

I think such people just aren't paying attention.

Everyone who understands American economic policy knows that the currency is being slowly devalued on purpose. This is not a conspiracy theory, it is common knowledge. The inflation target is always greater than 0. This is in part because of the perceived risks of deflation -- better to be on one side of the line than the other -- but also, generally, the point is to encourage people to spend or invest rather than passively save, because spending and investment grow the economy.

To a libertarian this is one of the most oppressive things about the way the government works currently ... it forces everyone to work more than they would ideally have to, in a sense. (But I say "in a sense" because if the economy were at a much less active level as "normal" maybe everyone would have lower quality of life. I don't know.) If you ever wondered why Ron Paul dislikes the Fed so much, well, it's because of reasons like this.


context

It takes CPU to decompress audio, and games can easily be playing 100+ SFX at once.

context

>> Good programmers understand how malloc works.

> "Good programmers know how their GC works. What, are you kidding, or am I misunderstanding?"

I think you are not understanding what I am saying.

You link your allocators into your code so you know what they are. You see the source code. You know exactly what they do. If you don't like exactly what they do, you change them to something different.

A garbage-collector, in almost all language systems, is a property of the runtime system. Its behavior depends on what particular platform you are running on. Even 'minor' point updates can substantially change the performance-related behavior of your program. Thus you are not really in control.

As for your other examples, apparently you're a web programmer (?) and in my experience it's just not very easy for me to communicate with web people about issues of software quality, responsiveness, etc, because they have completely different standards of what is "acceptable" or "good" (standards that I think are absurdly low, but it is what it is).


context

Seattle, and we'll see how that goes.

context

San Francisco is just kind of stupid at this point. I have lived in the Bay Area since 1989 and I am leaving on Tuesday. Whatever benefits there are to being here do not outweigh the huge monetary cost and substantial degradation of quality of life.

(And before you even start thinking about paying rent or whatever, if you are in the top tax bracket, enjoy paying 13.3% state income tax, which, after Federal tax, is a staggering 22% of your income.)


context

> Perhaps I'm misunderstanding, but do many C programmers understand not only the current state of malloc at any given moment in their code but exactly how it works?

Good programmers understand how malloc works. What, are you kidding, or am I misunderstanding?

Performance-oriented programmers do not use malloc very much. As you say, you can also try to avoid allocations in GC'd languages. The difference is that in a language like C you are actually in control of what happens. In a language that magically makes memory things happen, you can reduce allocations, but not in a particularly precise way -- you're following heuristics, but how do you know you got everything? Okay, you reduced your GC pause time and frequency, but how do you know GC pauses aren't still going to happen? Doesn't that depend on implementation details that are out of your control?

> even though in practice SO many things do not deliver on this promise in shipped products!

But, "in practice" is the thing that actually matters. Lots and lots of stuff is great according to someone's theory.


context

It's not unusual in completely terrible code.

The difference is if the memory management is manual, you have the ability to clean it up and reduce that overhead toward 0%.

If it's a system-enforced GC, you are limited in what you can do.


context

Waterfall is a straw man.

context

I wish most of the people who talk about the Simulation Argument understood this.

There is not even a reason to believe that the "outer universe" has such things as space and time or information as we know it, and no way to know what a "computation" might comprise in such a situation.

Maybe the situation is not that pessimal, and an outer universe is much like ours, but to prefer that belief one would need evidence, of which we have none.


context

#9 is especially stupid because it's so context-dependent. SSE4 gives you a popcount instruction, for example, which would be easily the fastest way to do this, if available.
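(With SSE4.2 available it is one instruction via an intrinsic -- which is exactly the kind of context the article ignores:)

    #include <nmmintrin.h>   // SSE4.2

    unsigned count_set_bits(unsigned x) {
        return (unsigned)_mm_popcnt_u32(x);   // compiles to a single POPCNT
    }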

context

Wait I just realized maybe a lot of kids these days don't know what assembly language is and that's why one might say this "doesn't have a lot to do with programming"???

context

I am not sure how you can say these games "don't have a whole lot to do with programming". They are programming.

Maybe you didn't ever have the experience of programming on an 8-bit CPU and don't get the joke? The machines in these games are comically limited, "this is like the cruftiness of an 8-bit CPU but even worse". The funny thing about TIS-100 is that it's a speculative fiction game, postulating an alternate reality -- what if we had gone down the path of multicore CPUs back when they were still super-primitive?


context

Dude what are you TALKING about. Astronauts have even died during rehearsal events that were not real launches!

https://en.m.wikipedia.org/wiki/Apollo_1


context

Gee I guess all those times the US space program had things explode were 'unacceptable mistakes' to you as well.

context

"Not rational enough"? What are you talking about?

I don't want to think about what time it is when I go on Facebook or whatever, because I have more important things to think about. There is nothing irrational about that (though the premise that increased rationality is something to be desired in this context is deeply questionable to begin with and maybe slightly creepy.)

When an ISP can offer a plan that requires me to think as little about it as possible, that simplicity is a valuable service.


context

If we made any mistakes, they are pretty small.

What gets super confusing is that you have a bunch of different stuff flying around. You have textures in different formats and render targets in different formats (some are in sRGB, some are in HDR 16-bit floating-point, some are other random formats somewhere in-between). You need to set up your shader state to do the right thing for both the input texture and the render target, and the nuances of how to do this are going to change from system to system. Sometimes if you make a mistake it is easily spotted; other times it isn't.

And then there are issues of vertex color, etc. Do you put your vertex colors in sRGB or linear space? Well, there are good reasons for either choice in different contexts. So maybe your engine provides both options. Well, now that's another thing for a programmer to accidentally get wrong sometimes. Maybe you want to introduce typechecked units to your floating-point colors to try and error-proof this, but we have not tried that and it might be annoying.
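(The typechecked-units idea might look something like this -- again, we have not tried it, so this is a guess at the shape, using the standard sRGB conversion:)

    #include <math.h>

    // Distinct wrapper types so sRGB and linear values can't be mixed silently.
    typedef struct { float r, g, b; } SrgbColor;
    typedef struct { float r, g, b; } LinearColor;

    static float srgb_channel_to_linear(float c) {
        return (c <= 0.04045f) ? c / 12.92f
                               : powf((c + 0.055f) / 1.055f, 2.4f);
    }

    LinearColor srgb_to_linear(SrgbColor s) {
        LinearColor out = { srgb_channel_to_linear(s.r),
                            srgb_channel_to_linear(s.g),
                            srgb_channel_to_linear(s.b) };
        return out;
    }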

All that said, everyone is about to rejigger their engines somewhat in order to be able to output to HDR TVs (we are in the process of doing this, and whereas it is not too terrible, it does involve throwing away some old stuff that doesn't make sense any more, and replacing it with stuff that works a new way).


context

Actually no, there are plenty of excuses, because this stuff gets super confusing and it's easy to make a mistake.

context

It's the sarcasm tied to apparent lack of desire to discover or communicate actual information.

There are entire organizations devoted to assessing the effectiveness of various kinds of charity and measuring how many lives they save (e.g. http://www.givewell.org/ and https://www.givingwhatwecan.org/), and their reports can be found within 20 seconds of googling, less time than it took for you to type your based-on-no-actual-information sarcastic judgement.

Sarcasm plus knowledge would be fine.


context

Being able to get these errors without running the code is almost all the benefit.

It enables a whole class of highly effective program manipulations that are just unavailable to a non-statically-checked language.


context

It sounds like you use web languages or other bad programming languages? Aggressive refactoring is not a problem in statically typechecked languages, in fact it is common.

We test the hell out of our stuff, and it works way more reliably than most web sites I have ever seen. But we don't do it with unit tests, because unit tests are not very useful in complex systems, because they do not test anything hard!


context

This is fine if you believe that unit testing every 40-line chunk of code is remotely worth the time and effort. I don't think that is true for most applications.

How long does it take you to write and test all those tests? Could you have been doing other things with that time? At 40 lines of functionality, the tests are going to be at least as big as the things you are testing (??), so what kind of a multiplier are you taking just on lines of code written? How much does that cost?

[I run a software company where I pay for the entire burn rate out of my own pocket. So these questions are less academic for me than they are for many people.]


context

When you factor it into a bunch of 40-line things, is it really less of a mess, or is it just that you can't see the mess any more -- maybe it looks clean, but if you pick up the rug, the room is filled with dust?

I think also what you're talking about is a function of programmer skill. I think if you have a good programmer write a 1000-line procedure, and a bad programmer write a 1000-line procedure, you are going to get drastically different things ... just like with anything.


context

It's the procedure that constructs most of the puzzle panels in the game.

Usually I just search for the name of the puzzle I want to edit (which is also how you'd do it if it were a ton of different procedures).


context

Obviously, he's a different person and has a different opinion.

My experience has been that people on HN tend to interpret that part of the posting a little more extrapolatingly than I do. I think he is saying something pretty obvious, which is that when you can structure things in terms of pure functions, you don't have to worry about the side-effects that are one of the main issues you need to contend with when factoring things apart.

This is different from being a "fan of functional programming", i.e. believing you should use current functional programming languages to build your projects, or whatever.


context

What size of codebase are you talking about?

If it's 100,000 lines of code, and you break stuff every 40 lines, you have now introduced 2500 procedures many of which don't really need to exist. But because they do exist, anyone who comes along now has to understand this complex but invisible webbing that ties the procedures together -- who calls who, when and under what conditions does this procedure make sense, etc.

It introduces a HUGE amount of extra complexity into the job of understanding the program.

(Also you'll find the program takes much longer to compile, link, etc, harming workflow).

I regularly have procedures that are many hundreds of lines, sometimes thousands of lines (The Witness has a procedure in it that is about 8000 lines). And I get really a lot done, relatively speaking. So I would encourage folks out there to question this 40-line idea.

See also what John Carmack has to say about this:

http://number-none.com/blow/blog/programming/2014/09/26/carm...


context

This article is a ridiculous hit-piece. Ugh.

From reading it you would think Tesla was some kind of failure of a company, rather than a miraculous startup that has done what no American car company has managed to do in over 100 years.

"You see, the fact that Tesla has 400,000 preorders is actually a sign of failure!" Yeah, tell me more...


context

I have to completely disagree ... What are you even talking about? The inability to have circular links is deeply hobbling; they arise naturally in all kinds of circumstances.

context

Where did you learn CS that they didn't have lots of cyclic data structures? We had those even in our intro class at Berkeley...

context

Is this post just an ad?

context

This is not true. You can't factor a quaternion into two complex numbers. Please don't spread misinformation.

context

Haskell has been around for over 20 years. If it hasn't set the programming world on fire yet, there are probably reasons.

context

Yet phones, which only need to draw text and flat bitmaps, are laggy and stall all the time.

context

It's correct that you can look in any direction with only two planes, but that's not enough; you need to be able to control your orientation around that final axis. You need 3 planes for that (3 "rotational degrees of freedom"). The idea that a quaternion is somehow two complex numbers is wrong.

context

"Typically, desktop applications for each operating system are written in each's native language. That can mean having three teams writing three versions of your app. Electron enables you to write your app once and with web languages."

I stopped reading there. If you have total ignorance of how native applications work, maybe fill that hole before trying to evangelize that everything should be written in JS...


context

There's another factor: MMA fights are one-on-one in a controlled environment. If you get into a fight in a bar and take your opponent to the ground, his friend is going to kick you in the head and now you have a concussion. MMA fights and the styles that succeed most frequently in MMA are based around this kind of maneuver, and it works because you know you are fighting one dude and nobody else is going to bother you.

context

Wait, you are comparing 4000 lines of code to 150 lines of code? How does that make sense?

If you modify your example to 400 nested 10-line function calls, how does that change your comparison?


context

Due to aliasing problems, you have exactly this issue in C and C++ as well, yet those languages are both massively faster than the languages in question.

context

It is because, if Tesla allows it to happen, the media will attack them incessantly over these things because they smell a story (and because a lot of vested interests are willing to pay for PR). If you don't defend yourself in that kind of situation, you die.

context

There's free source code for basic Unicode operations all over the internet.

context

Care to explain why you think it's hard? (For someone with a basic education in programming, say, someone with a bachelor's degree from a reasonable school). Exactly what part of this problem is hard?

context

I watched that talk when it appeared on the HN front page, and I actually think the whole methodology he is talking about is misguided. I don't find any of the "incremental program understanding" stuff in Visual Studio to be useful at all. I wish it were not there because it only causes problems and distractions.

It's a case where some people are choosing to do something that is a lot harder than a straightforward parse ... but as a user, a straightforward parse is actually what I want.

That said, even if you thought this was the right way to go, I am not sure that the internals of their code would look anything like the kinds of parsing tools you are talking about, so I am not sure it supports your point in any way.

> And again, I'm not claiming that ALL parsing is hard.

Parsing is easy. The video you link above is harder, but that's not really parsing any more, it's more like "make sense of this text that is sort of like a working program", which is more like an AI problem.

But anyway. It's pretty clear you haven't written many parsers (or any) so I am going to stop arguing. If I were to "win" this argument I wouldn't get anything out of it. I am trying to help by disabusing people of the notion that certain things are harder than they've been indoctrinated to think. If you don't want that help, fine ... just keep doing what you do and the world will keep going onward.


context

Well, I think a lot of web programmers do not really know how to program.

If someone is going to be offended that a potential employer asks them to reverse a linked list in an interview -- something that seems a bit trendy in the web world lately and several such articles have made the HN front page -- then look, that person does not really know how to program, so of course they think it's hard to do basic stuff. Such a person's opinion on how hard things are is not that relevant to how hard they are given a reasonable background education.

Probably this sounds snobby to some people, but look, programming well is a never-ending pursuit, you can spend your whole life getting better, but it won't help anyone advance if we all pretend that everyone is good already.


context

"Try adding an autoformatter like gofmt or IDE completion for Jai, and see how your parser changes (or if you have to write an entirely new parser)."

It would not change at all, and I have no idea why you think it would, except to guess that the model you have in your head of a hand-written parser kind of sucks. They don't have to suck.

"...not knowing what language you've designed." I have no idea what you're on about here either.

Look, I think you are making things a lot harder than they are. I am not bragging ... I used to build lexers and parsers by hand 23+ years ago when I was a student in college and had almost no programming experience compared to what I have now. It is not hard. If you think it's hard, something is missing in your knowledge set.

(I also built stuff using parser tools 23+ years ago, and was able to very clearly contrast the two methods. Parser tools have gotten slightly better since then, but not much.)


context

Well, you just need to call unicode_next_character all the time instead of saying s++, similarly for whitespace, similarly for asking whether a character can initiate or continue an identifier, etc. It does not change the basic nature of the task at all.
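
A minimal sketch of what that call can look like, assuming UTF-8 input (only the name unicode_next_character comes from the point above; the decoder body is illustrative and validation of malformed sequences is omitted):

    #include <stdint.h>

    uint32_t unicode_next_character(const char **s) {
        const unsigned char *p = (const unsigned char *)*s;
        uint32_t c = *p;
        int extra_bytes = 0;
        if      ((c & 0x80) == 0x00) { extra_bytes = 0; }             // ASCII byte
        else if ((c & 0xE0) == 0xC0) { c &= 0x1F; extra_bytes = 1; }  // 2-byte sequence
        else if ((c & 0xF0) == 0xE0) { c &= 0x0F; extra_bytes = 2; }  // 3-byte sequence
        else                         { c &= 0x07; extra_bytes = 3; }  // 4-byte sequence
        while (extra_bytes--) {
            p++;
            c = (c << 6) | (*p & 0x3F);   // fold in one continuation byte
        }
        *s = (const char *)(p + 1);
        return c;
    }

The inner loop of the lexer then calls this wherever it would have said s++, and everything else stays the same.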

context

If you can't write a lexer by hand, just forget trying to write a compiler that does anything interesting, because the lexer is MUCH easier than any other part of the compiler.

There are a lot of reasons for this, but one of the basic ones is that the lexer does not need to interact in a complex way with the compiler's state. It is a relatively simple pipeline where characters go in one end and tokens come out the other.
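
A minimal sketch of that pipeline (all names invented; identifiers, numbers, and single-character punctuation only):

    #include <ctype.h>

    typedef enum { TOKEN_IDENT, TOKEN_NUMBER, TOKEN_PUNCT, TOKEN_EOF } Token_Kind;

    typedef struct {
        Token_Kind kind;
        const char *start;   // a slice into the source buffer
        int length;
    } Token;

    Token next_token(const char **s) {
        while (**s == ' ' || **s == '\t' || **s == '\n') (*s)++;   // skip whitespace
        Token t = { TOKEN_EOF, *s, 0 };
        if (**s == 0) return t;
        if (isalpha((unsigned char)**s) || **s == '_') {
            t.kind = TOKEN_IDENT;
            while (isalnum((unsigned char)**s) || **s == '_') (*s)++;
        } else if (isdigit((unsigned char)**s)) {
            t.kind = TOKEN_NUMBER;
            while (isdigit((unsigned char)**s)) (*s)++;
        } else {
            t.kind = TOKEN_PUNCT;
            (*s)++;
        }
        t.length = (int)(*s - t.start);
        return t;
    }

Characters go in one end, tokens come out the other; no compiler state is consulted anywhere.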


context

I don't know what Ohloh is but it looks like it's some kind of web software, in which case I am totally unsurprised.

Are you confident that the same programmers could have successfully built a line-counter if they built it using parser tools?


context

Precedence is actually not hard. The way I do it is:

(1) Parse everything as though it were left-to-right.

(2) After each node is parsed, look at its immediate descendants and rearrange links as necessary. (Nodes in parentheses are flagged so you don't rearrange them.) There is a sketch of this rearrangement step below.

I can tell that the person above who is listing off a bunch of reasons not to use "recursive descent" hasn't written a compiler by hand ever (or not well). Most of the things he is talking about are easier to do by hand than in some complicated and relatively inflexible system.

Note that 'prediction' is mostly a red herring since you can look as many tokens ahead as you want before calling the appropriate function to handle the input. You would need to have a pathologically ambiguous language in order to make that part hard, and if your language is that ambiguous, it is going to confuse programmers!

In general, parsing is easy (if you know how to program well in the first place) and is only made more difficult/inflexible/user-unfriendly by using parsing tools. That doesn't mean that academic theories about parsing are bad -- it's good that we understand deeply things about grammars -- but that does not mean you should use those systems to generate your source code. (I do think it's a good idea to use a system like that to spot ambiguities in your grammar and decide how to handle them, because otherwise it's easy to be ignorant... But I would not use them to generate code!)
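
Here is that rearrangement step from (2), sketched in C (all names invented). The naive left-to-right pass parses "a - b - c" as a - (b - c); calling fix_precedence on each binary node as soon as it is built, innermost first, rotates the links until precedence and associativity hold:

    typedef struct Node {
        int op;                       // operator token for binary nodes
        int precedence;               // higher binds tighter
        int is_parenthesized;         // flagged: never rearrange into these
        struct Node *left, *right;    // both NULL for leaf nodes
    } Node;

    Node *fix_precedence(Node *node) {
        Node *r = node->right;
        // If our right child is a binary operator that binds no tighter
        // than we do, and was not parenthesized, rotate it up above us.
        if (r && r->left && !r->is_parenthesized
              && node->precedence >= r->precedence) {
            node->right = r->left;             // steal its left subtree
            r->left = fix_precedence(node);    // we may need further rotations
            return fix_precedence(r);          // and so may the new parent
        }
        return node;
    }

For example, a - (b - c) comes out as (a - b) - c, while a + (b * c) is left alone because + binds less tightly than *.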


context

Yes, it is hand-written. The parser is MUCH easier to build than the interesting/new parts of the compiler are. Lexing is the absolute easiest thing, then parsing is 2nd place.

context

Text parsing for programming languages is NOT a difficult problem. It is very easy actually, much easier than most academics would have you believe.

What they are doing is trying to write theories and build conceptual systems about how to do things. That is their job. But when it comes to practical matters, the best route to take, as someone who wants to build a working compiler that gives good error messages and where the parser does not hamstring the rest of it, is to ignore almost all that stuff and just type the obvious code.


context

Except no, lexing is trivial; it is by far the easiest part of writing a compiler. You don't need anything fancy, you just type in the code.

context

The problem is that games designed to "stick over many months & years and monetize the player" are generally garbage, as far as the quality of the actual game goes.

Of course these things are relative or subjective and what have you, but it's pretty rare for people who have a lot of experience playing games to seek out stuff on iOS because of the great quality of games there.

If you build a system that incentivizes garbage games, that's what you get, and well, that is what we have.

Fortunately if you are someone like me, who wants to make actual good games, there are still platforms where you can do that, and do quite decently money-wise. I am hoping those don't go away.


context

Indeed, I am kind of surprised at how many in the press are backing Gawker, after Denton straight-up threatened to use the site (and presumably others) for further blackmail. It is crazy.

context

That's not the problem. The problem is that any other live pointer could point into that array, for all the compiler knows. Unless the array was generated locally and never aliased, if you ever call through some set of procedures that the compiler doesn't have complete visibility into, it is possible that that code generates a pointer into that array and then uses it. The compiler has to be VERY conservative.
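
A minimal illustration (names invented) of how conservative the compiler must be:

    float values[1024];

    void mystery(void);   // opaque: defined in some other translation unit

    float sum_twice(void) {
        float a = values[0];
        mystery();              // might write values[0] through some alias
        float b = values[0];    // so this load cannot be optimized away
        return a + b;
    }

Because values is visible outside this file, mystery (or anything it calls) might write into it, directly or through a pointer someone stashed earlier, so the compiler must reload after the call instead of assuming b == a.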

context

You can't pass an array to a procedure without it becoming a pointer. But that is not even the issue ... the issue is that even if you know something is an array, any pointer of any type could be pointing anywhere into the interior of that array.

context

No, because the comparisons are usually "baseline version of algorithm I want to beat" vs "highly optimized and hand-tweaked version of the algorithm I have a vested interest in."

context

"... in the next phase, you too will be subject to a dose of transparency. However philanthropic your intention, and careful the planning, the details of your involvement will be gruesome."

http://gawker.com/an-open-letter-to-peter-thiel-1778991227


context

No. The way to ensure that your store works atomically is to specify the assembly output of the compiler. Reading what it is outputting right now doesn't matter, because how do you know when it is going to change its mind?
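
Short of hand-writing the assembly, C11 atomics are one way to state the requirement so the compiler is obligated to honor it (a sketch of the idea, not the original code under discussion):

    #include <stdatomic.h>

    _Atomic int ready;

    void publish(void) {
        // The compiler must emit an atomic store with release semantics
        // here; unlike a plain `ready = 1;` on a plain int, it is not
        // free to change its mind about the instruction sequence later.
        atomic_store_explicit(&ready, 1, memory_order_release);
    }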

context

The Carterfone was an early exception because of its functionality.

context

It's a complex history, but it started with the government's prevention of what would have been an even bigger monopoly.

https://en.m.wikipedia.org/wiki/Kingsbury_Commitment

But for the most part, it is just that AT&T kept buying smaller companies, which is just what happens in capitalism when one party starts to win, which is why checks on capitalism are necessary.


context

"absolutely disastrous"?

I am 44 years old, which means I remember growing up at a time when you were not allowed to own a telephone -- because AT&T exercised its corporate monopoly to control what you could plug into your AT&T phone line, and they would only permit that to be an AT&T phone, and they would not ever sell you an AT&T phone, they would only rent you one at an exorbitant price. And they didn't bother to provide you any variety in models, because why would they? There's one phone, that is what you get.

Also, if you wanted to call someone in a different area code, then I hope you are ready to shell out some cash...

If it weren't for state-exercised power, it is quite possible that things would still be this way.

I do not consider today's situation a disaster at all, relatively speaking. (For sure there are still many un-ideal things about it.)


context

Are you serious? The headline was "Peter Thiel is totally gay, people."

context

I think their bald contempt for the legal process had a lot to do with it:

http://gawker.com/a-judge-told-us-to-take-down-our-hulk-hoga...


context

I think the problem goes deeper than "don't publish sex tapes".

I have had a substantial number of news blog stories written about me, and I think among the general population of news bloggers there's a lack of professional ethics of the kind journalists supposedly used to have. Certainly not all bloggers are bad; some of them are upstanding, but really the majority are not.

When the incentive is just to get the most hits, it is very easy for a blogger to present a quote or situation out of context, or even for an editor to slant a headline in a certain way, in order to make the maximally inflammatory result. When this happens, it is parasitic behavior -- they are degrading your reputation in order to make money. But the amount of money they make off that article is small compared to how much you value your reputation, so the result is massively net-negative to the world.

This has happened to me A LOT so I have a pretty well-tuned sense for how it happens. I also have a pretty long list of journalists and outlets I won't do interviews with ever again.

The issues get pretty subtle. For example, it is common for them to take a one-or-two-sentence aside from an interview and write a whole article about it, making it seem like you called a press conference just to say that one thing -- which is a massive distortion of your intent (and usually your personality). Because they want the most hits and people being enraged makes hits, it is usually a negative distortion. And it's intentional -- they are trained to look for these opportunities. I think it is very unethical, though of course there is nothing illegal about it -- you did say that exact thing.

I think as long as that is happening, it is hard to take these sites seriously as producing "journalism".


context

It seemed even-handed to me.

context

Same experience here. I stopped going to Apple Stores because the experience was so stupid.

context

"a derivative copy of well known software idea"

As someone who has been around the web from the beginning, I will tell you this is horse shit.

Andreessen was building Mosaic at NCSA when very few people knew what a web browser even was. (There were only a few browsers in existence at that time, most of them were unusable, and the most popular one displayed only text. ftp was still massively more 'popular' than the Web, and in fact so was gopher ... gopher, FFS.) O'Reilly hosted what was basically the first WWW conference, in New Orleans, sometime in 1992. The attendance was about 40 people -- that is how big a Web conference was at that time. Marc was there. (So was I). People were mad at him because Mosaic was hacking the IMG tag into HTML without waiting for everyone else to discuss and agree on a standard.

So yeah, you are denigrating someone despite having no idea what you're talking about. But hey, I guess that is par for the course on an internet forum.

Also. Marc actually had hair in 1992!!


context

By people paying to read the article.

context

Except not really because Vulkan is apparently not very good.

context

Companies always start small.

Your argument is basically "the small company is small". So what? Do you propose that business can work in some other way? How else would you propose that it works?


context

First it was "electric cars are not viable technologically, they have no range, there is no charging infrastructure, etc."

Then it was "electric cars are too expensive, they are rich peoples' toys, and mainstream consumers will never want them anyway."

Now it is "Tesla will not be able to scale up to meet all the demand."

The fact that EV naysayers have been forced to cycle through this spectrum of responses in less than 10 years should be a clue of some kind.


context

They also cut the cost of launch heavily when compared to ULA ... and ULA also benefits from all the advantages you describe.

context

You act as if it's somehow not easier to commission the remanufacture of parts for a well-understood plane than it is to manufacture a whole new, untested plane.

Come on, seriously.


context

You're missing the main thrust of the argument, I think (and this is maybe a little expected because the article kind of presupposes it).

Imagine a hippie concept such as "you are effectively the same being as that guy over there, if either of you gets hurt, it's isomorphic, it's equally bad to the overall organism".

Now, imagine that this is objectively true, i.e. there is something in the basic laws of reality that, if you could observe it, would show the hippie idea to be obviously true.

Then perceiving this part of the laws of reality would be anti-fitness, so you would evolve to be blind to it.


context

Good point!

context

> In this case it means giving a voice to a racist advocate of slavery.

This is possibly disingenuous, and at least overly rhetorical.

They are not "giving him a voice" to talk about anything related to anything racist, and I'm sure if he used his slot to talk about anything racist, he would get perma-banned from the conference.

It is hard to say more than this without just repeating things said in the conference's statement. The idea is that a professional society ought to be able to cohere even when the members of that society disagree on matters outside the subject at hand. It seems like a good idea.


context

$133 billion per year! That would buy us one F-35 fighter jet per year!

context

The point is that if you can't do this, there is no way you can do stuff more complicated than this, which will actually be necessary during regular work. (And which you won't be able to type into Google).

I run a software company and I'll say straight up I would not hire someone with your attitude.


context

Anyone who downvotes replies like this is displaying ignorance of Microsoft's pattern of behavior through their entire history.

Sure, maybe it's different this time ... But usually it isn't.


context

Except ... reversing a singly-linked list recursively is trivial.

If you can't do this, you are not qualified at basic manipulation of data structures and you should fail the interview if they want someone with basic competence in data structure manipulation. Sorry but that is how it is.
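
For reference, a sketch of the recursive version in C (names invented):

    #include <stddef.h>

    typedef struct Node {
        struct Node *next;
        int value;
    } Node;

    Node *reverse(Node *head) {
        if (head == NULL || head->next == NULL) return head;
        Node *new_head = reverse(head->next);   // reverse everything after head
        head->next->next = head;                // old successor now points back at us
        head->next = NULL;                      // head becomes the new tail
        return new_head;                        // the old tail is the new head
    }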


context

This analogy is correct, except ... you're not Robert De Niro, and neither is almost any programmer in the valley. Just like almost no actor is Robert De Niro, and most actors are lucky to even be able to audition for commercials.

context

If your friend was not very senior (10+ years experience), I would not believe what he says, since he probably did not have enough experience to judge the situation.

I think the idea that the game industry is "behind" other fields is kind of comical, given that games are some of the most complex software in the world, and big game teams have only a few hundred people on them, and meanwhile something relatively trivial like Twitter has 4000 people. It's true that game teams don't do a lot of Agile or TDD or whatever the next buzzword is, but that is because those things are mostly superstition and obviously don't work when you start attacking hard problems.

So if you are someone a few years out of school who learned TDD it is easy to say "games are behind, they don't do all the new stuff!!" while being unaware that almost all the new stuff is bogus cargo-cultism anyway.

I do agree that the game industry engages in unhealthy levels of crunch that are to its long-term detriment, but this is mostly an orthogonal issue to software engineering practices.


context

Yeah ... I don't know what PrlConf is (and the link does nothing to explain) but LambdaCon's policy and statement about it seem totally well-reasoned and fine.

So cancelling whatever Prl is seems like yet more internet outrage culture that we would be better off with less of.


context

Actually, it is easy, if you had the foresight to put in versioning. Which they didn't.

context

Yes, the right thing is to port your language's standard library to whatever operating system you want to run on ... just like libc did.

You could have a dummy version that just calls out to libc, for compatibility with systems that you haven't finished porting to yet.


context

Oh, but I mean in Rust generally. Ideally any language that is not C would not depend on libc.

context

What is the roadmap on getting rid of the need for libc? Given how terrible libc is, I personally would make that a high priority, though I guess in Linux you can't even start up a process without libc (maybe that is a misunderstanding?), which makes the situation less clean, but at least you could get to a point where you never call back into it after entry into main.

context

You guys know you really mean "Web API" and not "API", right?

This kind of phraseology is a clear sign that someone's programming experience is extremely narrow in scope.


context

There is no contradiction.

Yes, I am saying that most plans on how to do things better are not right. Doing things better is often pretty hard.

But there always is some way to do better. The way you find that is you keep trying a lot of things until you build up an experience-based picture of what things are really like. As you get better at this, plans you formulate become more likely to be net-positive.

What I am saying is that TDD strikes me as a pretty terrible plan in the first place, the product of exactly this kind of idea untempered by serious experience.

Speaking for myself, I am pretty sure my own productivity would plummet were I to adopt TDD, and in fact I would completely lose the ability to build software as complex as I do; I would drop at least a level or two there. This does not necessarily speak to TDD's suitability for anyone else, which is why I am recommending to judge by output.


context

It is important to read the rest of the sentence in order for this comment to really make sense.

I am talking about any scheme of how to do things that is intended to provide benefit. These all start with "wouldn't it be better if X, because Y" and then a plan is made of how to bring this about.

Well, this plan is inevitably imperfect, so it is either that you don't get all of X, or the reasons Y were not correctly understood or accounted for.

Then, there are always some extra drawbacks that creep in that negate some of the benefits. Usually these drawbacks are very subtle, and they can be hard to notice because they are not things that the plan was trying to address.

In the end, usually the net result is negative: the scheme causes more damage than it provides in benefit. But usually it takes a long time to understand this clearly, because the drawbacks can be subtle (though sometimes they aren't; in TDD, for example, it's the sheer amount of extra code you are writing all the time).


context

For many paragraphs, I thought this was a parody of a TDD defense, but then it turned out it wasn't.

His 'defense' of the point is basically: Look, when you do TDD you have to put a lot more work into the tests than you thought! It is not just a simple thing!

Okay, fine, but ... Before embarking on TDD, the programmer had a picture in his head of what the costs+benefits of this change would be. Now you are telling him the costs are WAY higher. So a successful defense would have to then make the case that the benefits are also WAY higher.

But he doesn't. Because the benefits aren't higher, in fact they are lower (as is the case with every well-intended scheme in the history of anything.)

As usual my advice on this is: look at the people who build things you find highly impressive, and study how they did it. This is much more fruitful than reading the output of people who want to spend all day telling you how to program (which leaves very little time for them to build software that is impressive, i.e. they never even test their own ideas!)


context

This is legendary. Most people (including me) would have thought this would not be possible for decades.

9-dan is the highest rank in Go. It is not possible to play against anyone higher.

So I am not sure why you think it isn't a big deal.


context

This is one reason among many why people who write serious low-level code (e.g. game developers) think all the new aliasing rules are completely bonkers.

We implement our own allocators all the time. If you can't even do such a basic thing legally, then the rules are obvious nonsense.


context

> by the standards of 1995 are pretty damn amazing, responsive, and beautiful

Yeah no. If you had gone back to 1995 and told me that gmail was what I would get once I had a supercomputer in my pocket, a super-super computer on my desk, and all web pages served by SUPER-super-super computers, I would have quit the industry out of depression.

It is some horrible bullshit when you look at it in perspective.

About the quality issue, no surprise that I also disagree there: the web is especially crappy.

I do not consider any piece of software that I use to be performing acceptably (native or web), but there is a stark difference between the native apps and the web apps, in that the native ones are at least kind of close to performing acceptably, and also tend to be a lot more robust.

Web apps not working is just the way of life for the web. Any time I fill out a new web form I expect to have to fill it out three times because of some random BS or another.

Look at all the engineers employed by Facebook and especially Twitter. WHAT DO MOST OF THOSE PEOPLE EVEN DO? Obviously the average productivity, in terms of software functionality per employee per year, is historically low, devastatingly low. What is going on exactly??


context

I think if we decide heavily siloing / sandboxing is the right thing for software generally, then what you want to do is build an operating system that works that way (kind of like iOS, but with provisions to enable better data sharing so that you can actually make things with that OS).

This would be TREMENDOUSLY better than trying to make the browser into an OS.


context

The language is just too complicated at this point. Any time you want to add something you need to ensure it plays well with everything else. It is a huge amount of friction.

context

> Having a uniform experience for all clients

An experience that is uniformly slow and uniformly broken a different way on every browser...


context

Dude my first professional programming experiences were on a 486/33. Compared to that a P1/133 is pretty darn fast!

But as you say, there is not much point debating subjectivity here. It's not like I had the foresight to record benchmarks of how long it took web pages to appear, or to open a window, etc, back in the mid-90s.

Edit: How about if I put it this way:

If you went back in time to the 90s and told everyone "20 years from now, we will have a much more advanced web where EVERYONE WILL HAVE A SUPERCOMPUTER IN THEIR POCKET", people would imagine the web would be amazing, and responsive and beautiful, and we would be doing some seriously intricate stuff.

Instead ... no, we have a pile of junk that only kind of works, and slowly at that. In terms of potential unreached, the web is kind of a massive failure. (Yes, it is "successful" in the sense that we are able to do a lot with it that we could not 20 years ago, but the mediocre is the enemy of the good, and all that).


context

As someone who has been around since before the Web, I can confirm that computers today do not feel any faster... despite the fact that your phone is faster than the fastest computer in the world from that time.

In fact I gave a speech about this at Berkeley last week. I think it'll be online pretty soon.

So now you have at least heard someone claim this.


context

Did you read Microsoft's reply to Sweeney's article? They basically used sideloading as an excuse for why this isn't anti-consumer.

context

Today you can do that.

This is a boiling-the-frog kind of situation. They do just enough to get people today to accept what they're doing, then the next steps come later.


context

The Microsoft plan is to put all future development into the closed platform and thereby let the open platform die of old age.

context

Even if you think she failed, AND if you think this was her fault, she now has a LOT more experience at being a CEO than someone who has never been a CEO.

context

If it were "yesterday's battle", I would have a good programming language to use in my domain, without having to make one. But I don't.

Experimenting with model-based programming or whatever other future programming paradigm is healthy. I think we should do a lot of that, because the way we program hundreds of years from now hopefully doesn't look that much like today. BUT, you have to also be aware that there's a reason why these are future paradigms and not current paradigms, and that people building real programs today need to do something that works today. There is no way we could have built The Witness in any model-based system known today.


context

Wikipedia is wrong. It is not naive.

Yes, this does not solve every possible stack overwrite. But look at the number that have actually happened in the field and whether this would have dramatically reduced vulnerability in those actual real-world cases. Most times it would.

Most notably, for overwrites happening within the local stack frame, you completely remove the possibility of overwriting the return address. This is a fundamental difference in the level of vulnerability of that kind of code.

It's called "reducing the attack surface". Well-known idea.

Maybe my post was a bit hyperbolic, but I chalk that up to being so annoyed at this.


context

There absolutely are some silver bullets.

ALL of these buffer overflows in C happen because the C stack grows backward, and old space is after new space.

If anyone gave enough of a crap to standardize a calling convention that went the other way, stack buffer overwrites would ALWAYS go into unused memory. Then security-minded people would switch to this calling convention for secure programs, and many problems would be solved.

(Of course heap problems would still exist but they are much harder to exploit and it is easy to make an allocator that tries to confound heap attacks.)
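
A minimal illustration of the direction problem (the layout shown is the common convention, not a guarantee of any particular ABI):

    //   higher addresses:   saved return address   <- reached by the overflow
    //                       saved registers
    //   lower addresses:    char buf[16]           <- writes go buf[0], buf[1], ...

    #include <string.h>

    void vulnerable(const char *input) {
        char buf[16];
        strcpy(buf, input);   // no bounds check; excess bytes walk upward,
                              // toward the saved return address
    }

With the reversed convention described above, those excess bytes would walk into not-yet-used stack space instead.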


context

On the contrary, this is the best way to make code reusable, because it is the only way that allows a C or C++ program to add more code solely via the compiler, without also requiring modification of external build scripts / tools / etc.

When you are shipping and maintaining code on 5 or 10 different platforms, this really matters, because the friction of adding new files becomes huge ... you have to go add that file to 5 or 10 different fuckity fuck build systems that are all uniquely terrible, and hey maybe Apple updated XCode to whatever the new lousy version is instead of the old lousy version, so you have to go through the rigamarole of installing that, which of course won't completely work, oh and the internet is slow today, and on some console platform that shall not be named our dev software didn't auto-renew its license and that is mysteriously timing out so now we get to deal with that for hours, blah blah blah.

This is not an exaggeration. You're lucky if you actually get to do any programming on the day you decide to add a cpp file.

Experienced programmers who ship on a lot of platforms really want the simplest and most straightforward way of using code, and this is what that is for C and C++.

More modern languages could be designed to be better at this, but they usually aren't. (A 'package manager' is not really the answer, it is a solution to a kind-of orthogonal problem and usually brings in way too many of its own complexities.)
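
A sketch of the single-translation-unit approach being described here (file names invented). The build systems on every platform know about exactly one file, and adding code to the project is a one-line edit that needs no build-script changes:

    /* build_all.c -- the only file any build system ever sees */
    #include "renderer.c"
    #include "audio.c"
    #include "gameplay.c"
    #include "new_feature.c"   /* added today; no build-system edits anywhere */

The same trick works with .cpp files for C++ projects.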


context

You might, but that costs money, and there are always other market players incentivized to undercut you.

Because the line of how much is too much is indistinct, someone is going to guess wrong before too long. This is even before you mix in perverse organizational incentives involving short-term view or individual profit vs long-term company health.


context

Given how much we know at this point of the utter questionability of studies in the social and psychological sciences, I don't think it's at all productive to make this kind of point.

Sad but true.


context

Kids today.

In the late 1990s-early 2000s a decent gaming PC would cost you between $2500 and $3500, and those numbers represented more money than they do today.

A "very high end computer by today's standard", when it comes to games, would have a GPU that's substantially faster than what Oculus is requiring ... I find their requirement shockingly low and wonder if that is a tactical mistake.


context

If you set a maximum wage like that, a lot of what is good in society would not happen because people would not be able to amass capital.

For example, you would not have a SpaceX or a Tesla. You would not have Y Combinator as it exists today. You would not even have the video game that I am about to release next month.

Yes, there are a lot of jerks who amass capital and do nothing with it or who do irresponsible things. But you also have people who use it to work very hard to make positive change in the world, and even if those people are in the minority, their impact is very large.


context

No, this is how I program all the time. Usually when I start typing I have only a rough idea of what I want to do.

But in practice this is not a problem. So what if I don't know exactly what type a particular thing will be in the end -- I know generally if it is a number, or an array/list, or a hash/index... that is all I need to know. I use one of those basic types. If I need to change it later I change it later, and the fact that I am in a statically-typed language is great for changes like this because it helps me make them with high confidence.

This is why I don't believe that anyone who makes this argument in favor of dynamic languages really has that much experience in static languages. The actual outcome in real life is the opposite of what is described.


context

In a statically-typed language with reasonable tools you can right-click on an expression and see what type it is. This whole thing you're describing is a non-issue.

context

Oh, I've tried both all right!

There is more mental overhead in dynamically-typed languages, actually ... it's just less visible because it's implicit! It's the overhead of having to "keep all the type information in your head", which dynamic type proponents sometimes seem to be saying is a good thing.

It's not good because it is a tax on everything you do! Whereas in a statically-typed language, sure, you pay the small extra overhead of putting the types in the program text, but this is quite freeing in the long term, because you can then drop the burden of having to think about what type something needs to be, in most cases.

(It also serves as documentation / literate programming.)

My approach to programming tends to involve rewriting things several times, or heavily modifying them, and as someone who has been programming for 34 years, in a lot of different situations, I find that static typechecking is by far a superior framework when refactoring or rewriting code. It is not even close.


context

"...calcify and ossify data structures and types..."

What are you even talking about? This sounds like an assertion from someone who doesn't use statically-typed languages and is just guessing.

To change a data structure in a statically-typed language, you change the declaration then fix any compile errors. It is easy in most cases.

In a dynamically-typed system, data structures actually get way more ossified, because when you change a structure you don't really know what might be broken or when you are really done making the code correct again... Therefore programmers avoid this.


context

You can apply all your same arguments to gambling, but society disagrees -- gambling is regulated.

context

S&P is measured in nominal dollars, so of course it is near its all-time high. It should almost always be at its all-time high just because of inflation. So this doesn't tell you much.

context

That is not even what I am talking about.

I know a number of people who have quit Valve and almost all of them would cite organizational dysfunction as one of the top reasons for quitting.

I don't know whether that is true -- I have never worked there -- and I don't wish to spread any ill rumors about Valve. I'm just saying that I know a bunch of people who have worked there who think the flat thing is one of Valve's biggest problems (another one being the incentive structure; of course these two things go hand in hand).


context

You are presuming it actually works for Valve, which is not an uncontroversial notion.

context

What do you mean "way to make a valid criticism"? The criticism IS valid and that is obvious. This is a tiny and light rocket, and any VTOL system that would work for a big heavy rocket would be very different from this one and a much harder thing to engineer.

You are saying SpaceX lost a 'first' here to BO but that is not really true and that's Elon's entire point. This is not the first VTOL rocket landing either, maybe it is the first rocket to officially reach space and then subsequently VTOL land but that is not as big of a 'first' as most people are thinking it is.

Which is not to diminish what BO just did, it is just to see it in an accurate context.


context

If regular gas cars are so great, why do they need government subsidies?

Hell, if gas is so great, why does it need government subsidies?

Do your research. Subsidies for traditional cars and for oil are massive and dwarf anything Tesla has ever gotten.


context

No. If a fragment of a fragmented UDP packet is dropped, the packet is dropped. It is simple. If you understand why UDP exists you will get it.

context

I suggest you educate yourself about Bell's Inequality.

context

... And those developers are generally right.

I would use SDL in Linux ports of things because it is the closest to a reasonable native API on Linux (which says more about Linux than SDL actually). But even having done so I would then use native APIs in Windows, OSX, etc.

If your standard of quality is high enough, it won't really be possible to reach it using a blanket API like SDL everywhere.


context

There are many different kinds of prayer in the (for example) Christian tradition. You might be thinking of intercessory prayer, but there is also contemplative prayer, which is indeed a lot more like meditation.

context

I came here to post "this article is garbage" but I see I have been beaten to it 3 times by the only 3 other comments here. So.

context

All other arguments aside ... this idea also fails if the judging party's idea of quality is mostly uncorrelated with actual quality. Which Graham says in other essays is usually the case (it's what you mean when you say it's almost impossible to predict which companies will be successful).

Graham says the subjects of bias "have to be better to get selected", but what is really going on is they have to be better according to the metrics of the judge which are essentially arbitrary.


context

Only a billion? This is the USA in 2015 we are talking about. A single F-35C costs a third of a billion dollars. So you are talking about 3 planes.

context

No kidding. I was trying to watch some soccer games in England last month, and it was an exercise in having to fight through the flashing+animated ads around the entire field. I don't think I will ever try watching one of those matches again.

context

Part of the reason this is unappealing to consumers is there are actually two layers of bundling here.

One layer is the obvious cable TV bundling that most of us probably think is evil and should die.

But the second layer of bundling is ESPN itself. How many people care about "sports" in general, enough to pay for all the sports? No, people usually are into a couple of sports at most. They like baseball, or they like basketball and football, or they like the other football, etc. Or even, they like specific players.

I think there is a future to be had in selective channels available on the internet that cover specific sports in much greater detail than ESPN ever would.


context

Don't presume to give them so much benefit of the doubt. Apparently Stripe are kind of jokers.

This incident report reminded me of the 'Game Day Exercise' post from 2014:

https://stripe.com/blog/game-day-exercises-at-stripe

in which one robustness check that should be a continuous-integration kind of test, or at least a daily test of a normally working system, is such a big deal to them that they make a big 'Game Day' about it, and serious problems result from this one simple test.

After they have lots of paying customers, of course.

I know we are supposed to be positive and supportive on HN, but this was a red flag that the entire department had no idea what an actual robust system looks like and was so far away from that, after having built a substantial amount of software, that expecting them to ever get there may be wishful thinking.

So I am completely unsurprised that they are having this kind of problem. The post-mortem reveals problems that could only occur in systems designed by people who do not think carefully about robustness ... which is consistent with the 2014 post. It kind of shocks me that anyone lets Stripe have anything to do with money.


context

No trackpad buttons, no sale.

context

No kidding. If you want an image format to become widely adopted and standardized, GPLing the code is a pretty bad idea.

context

I won't ever hire anyone who has this company on their resume.

context

Yeah, my comment was not in response to your post, it was to the comment I was replying to.

context

Yes, indeed that was a big part of my point!

But also, even if I didn't have a choice and had to write the code in C++, I would do it in the style on the slide, not the style endorsed by the link on this HN thread.

I do agree that copying a line and changing a few spots in it is a common mistake pattern. However, I also still think that in many cases that is the best thing to do because it results in the simplest code. So rather than go through contortions in the actual code to try and prevent this, I am wondering if some kind of IDE pattern matching is a better way to catch this class of errors.


context

Guys, there are errors in this code because I just typed it into a text buffer without ever compiling or running it. It is an example from a slide that I didn't care very much about, for illustration purposes only.

If you did that with the C++ code in this article there would be errors too. Major straw man.


context

I agree! But the first step in a good decision is a clear understanding of the situation, and it is a barrier to understanding to just blanket-decide that procrastination is completely bad without even considering otherwise.

context

This is one of those cases where science, viewed globally, is being dumb.

Is it not consistent with the scientific narrative that procrastination, being a universal behavior, must have been developed evolutionarily for some benefit? It is a pretty sophisticated behavior after all (as the article even describes). So isn't it naive to assume it is a problem to be solved? Maybe it is, maybe it isn't, but shouldn't the early work be in trying to understand the full effects of procrastination on lifestyle and future fitness so that we actually get to a place where we can make judgements about it?

TL;DR: These guys are totally amateur hour.


context

I don't see how you can claim that the idea of wave function collapse here is a misunderstanding.

The mathematical meaning of the statement "two things are entangled" is that you cannot factor the state into the product of two separate quantities. So there is only one equation for the two particles. So once you "collapse" this for one particle (whether or not collapse is a physical action) it is by necessity collapsed for the other particle, because there is no separate state left over to remain uncollapsed.
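
A worked example of the non-factorability (the standard one, added here for concreteness): take the Bell state

    \[
    |\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr).
    \]

A product of two single-particle states expands as

    \[
    (\alpha|0\rangle + \beta|1\rangle)\otimes(\gamma|0\rangle + \delta|1\rangle)
      = \alpha\gamma|00\rangle + \alpha\delta|01\rangle + \beta\gamma|10\rangle + \beta\delta|11\rangle,
    \]

so matching coefficients would require \(\alpha\delta = \beta\gamma = 0\) while \(\alpha\gamma = \beta\delta = \tfrac{1}{\sqrt{2}}\), which is impossible. There is no factored form, so a measurement outcome for one particle is an outcome for the single joint state.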


context

With this scheme you would end up with pretty bad problems regarding function pointers and lambdas. Because the type is deeply implicit, you would have two function pointers that look compatible but are totally incompatible. Then when you want to assign them to a variable -- how do you know a priori what type to declare the variable?

context

Because in order to know which instructions in your procedure might execute, you need to know about the exception-handling behavior of everyone you call, which requires looking at the source code of everyone you call. User barkkel above in this thread was saying that passing around full return-value information for something like a file-open operation is somehow a violation of abstraction (this idea does not make sense to me) ... But I can't think of a bigger violation of abstraction than requiring you to know everything about everyone you ever call. Of course in reality people don't do this, which is then why programs that use exceptions have so many problems.

And if you say "why would you let exceptions bubble up that much", well, that is the whole point of exceptions, that they bubble up. If you say "to get rid of nonlocality just catch outside every call", well, now that's equivalent to checking return values always, but more error-prone.


context

I also prefer option 3 over option 4. It bothers me that the article recommends exceptions without seeming to understand their drawbacks (the major one: as soon as you use exceptions you suddenly need to apply nonlocal reasoning everywhere in order to understand what your program will do at any point. They turn your program from a simple local thing into a complex nonlocal thing, which is not a good idea if you want to understand it well.)

This article seems not to have a lot in the way of new contribution; it is just parroting the oft-repeated idea that you need exceptions to pass "error information" up several layers of abstraction. But here is what I think about this:

1) In this age when we are realizing strong typing is a good idea, that hidden state is a bad idea, and that in general you should be very specific about what is going on, why are we even conceptualizing this as "error information"? Why instead, when we try to open a file, do we not return "all the information you might need to know in the case of opening a file" (which includes what happened if it didn't open properly)? As soon as you make that conceptual switch, all this hand-wringing goes away. It's a non-problem. You certainly shouldn't add heinous complications to your program to solve this non-problem. (There is a sketch of this after this list.)

1a) This conceptual change also helps disambiguate between what the article calls "hard errors" and "soft errors". In portions of the code where you have attempted an operation that might have failed, and you are not completely sure that it didn't fail, you have the full body of "what happened" information (it is a small struct or whatever). After the situation has been checked and you know it is exactly what you need to be, you may drop the other information and pass the raw file handle. At this point it is clear that these parts of the code should only be executed if the file handle is valid, and if that is not true, the programmer made an error. This is analogous to the situation with nullable and non-nullable pointers (in some languages you would even use the same mechanisms to deal with null pointers and invalid file handles, etc, but I am not sure this is really helpful.)

2) If one insists on not making the simplifying leap from (1), well, maybe the other problem is that you have so many layers. If you didn't have so much glue, your code would be simpler and easier to deal with, and it would run faster, and you wouldn't be so worried about needing to pass lots of context information up several layers between modules, because those situations don't really arise.
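
A sketch of point (1) in C (all names invented): return everything you might need to know about the attempted open in one small struct, instead of maintaining a separate "error information" channel:

    #include <errno.h>
    #include <stdbool.h>
    #include <stdio.h>

    typedef struct {
        FILE *handle;      // meaningful only if succeeded is true
        bool  succeeded;
        int   error_code;  // errno captured at the point of failure
    } File_Open_Info;

    File_Open_Info file_open(const char *path, const char *mode) {
        File_Open_Info info = {0};
        info.handle = fopen(path, mode);
        info.succeeded = (info.handle != NULL);
        if (!info.succeeded) info.error_code = errno;
        return info;
    }

Per (1a), once a caller has checked succeeded, it can drop the struct and pass the bare FILE * onward to code that is only ever supposed to see valid handles.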


context

I tried using LibreOffice last year and it was a total joke. Almost nothing worked the way it was supposed to, and it crashed often. When it came time to give my speech and output slides to a projector, that just didn't work at all. My presentation had to be saved by someone letting me use their Mac. It was the worst software experience I have ever had, and I've had a lot of bad software experiences in my 43 years.

It's so bad that I believe it's unethical to offer it for download as working software because people with work to do on short timescales (as I had) may choose to rely on it and then get screwed.