> It's kind of a throwaway passage in the article but it feels like a crucial point. Maybe I'm wrong but "buying off" to me strongly suggests something along the lines of a bribe. We're going to give you this cash, nudge nudge, wink wink, maybe that negative stuff you print about us goes away. My question is: is this actually happening?
My question is: Do you really need to ask that? Human nature is human nature; there's little doubt that it's happening to some extent. Virtually impossible that it's not happening at all.
The problem to me seems quite similar to campaign contributions in politics. Yes, it's indirect, and all parties can say it's innocent, no strings attached, the elected representative is not "required" to do anything. Those are words that help people ignore the reality of human nature. Money is influence.
This is why the business side and news side of (quality) news papers have been historically kept separate. It also leads to perceived "weirdness" to readers when they see a paper taking money from a corporation to run an advertisement alongside an article critical of said corporation.
This isn't to invalidate your point completely, but to give some insight to those who are not aware of how this conflict of interest has been generally handled.
Yes, of course, I agree. And it doesn't invalidate my point at all. I would say that you're pointing out the problem that people don't understand how important it is to avoid conflicts of interest, and that we need systems in place to minimize it.
The case of Donald Trump is a perfect example. He had conflicts of interest galore, didn't even deny them; in fact, he flaunted them, and many people seemed to think everything was fine.
We need to be aware not only of the existence of conflicts of interest, but also of (1) how corrosive these conflicts are, and (2) the crucial importance of designing systems to minimize the bad incentives and effects that arise from conflicts of interest. Systems, in other words, that minimize the harm that -- because of human nature -- inevitably flows even from well-intentioned people when conflicts of interest are tolerated.
Yes, you really need to ask that. When accusing someone of misconduct you really do need to ask if it is really happening and not just hand wave about human nature and make judgements based on untested assumptions. You really really can’t just go along with whatever fits your biases.
Well, I would say that if you want to accuse them of misconduct then the main thing would be to ascertain intentions. Certainly, if there is no bad intention on the part of the actor, then it makes no sense to accuse them of misconduct. So if you want to accuse them of misconduct, definitely, yes, you need to ask questions about their intent.
On the other hand, regardless of intent, it does still make sense to say that the system is maladjusted, that the system is designed in such a way that even actors with good intentions are incentivized to do harmful things. And the question regarding the system really is clear: Does a recipient accepting money from an advertiser tend to create in them a more positive opinion regarding the advertiser than if the recipient received no money at all? To that, the answer is yes. It doesn't mean the recipient has committed "misconduct"; more likely it's just that the recipient is a well-intentioned actor in a maladjusted system.
Yes, we've got to make sure we're talking about the same thing here. In general, we don't morally censure a person who does bad things with good intentions. For example, in criminal law there is usually a requirement of ill-intent, called "mens rea"[1], which means the actor is aware that what they're doing is wrong.
If a person does something wrong but is not aware that what they did is wrong, the general reaction is not to "accuse them of misconduct", but rather to explain to them what it is about their action that's wrong, after which (if they in fact have good intent) they will no longer do that action.
Mens rea is subtly different, I think. It is intended to allow a defense that you really didn't want to break the law and did so unintentionally. It isn't quite about the self-perceived goodness of your intentions. That is, stating "I knew I was breaking the law but my actions were good and I didn't realise people would think I'd done something bad, therefore I don't have mens rea" won't work. Knowledge that it was illegal was sufficient. And mens rea doesn't offer a defense of ignorance of the law either.
It's also worth noting that quite a lot of broad crimes are strict liability these days, especially in America. For example money laundering is a strict liability crime, along with more obvious ones like speeding.
Misconduct though is not normally a criminal law term anyway. More like a code of conduct for an organisation.
Not really. I'm a lawyer so I have a decent grasp of this stuff. Mens rea is closely tied to intent, not really to whether it had anything to do with intending to break a law (ignorance of the law is generally not a defense). Think of the difference between someone who commits premeditated murder, someone who commits manslaughter through gross negligence (e.g., a drunk driver), and someone who by complete accident is responsible for another's death (someone runs out in front of your car). In each case a person's actions directly cause another person's death, however we attach different levels of culpability to each. We're getting a little bit far afield from original issue, here, though. Anyway . . .
This permissive attitude towards public corruption is a fairly new development in US politics.
Jimmy Carter put his struggling peanut farm in a blind trust to avoid the appearance of corruption. This has become a punchline today, but was standard at the time.
Frankly, the corruption of the Clintons, who generally took turns raking in cash while the other performed duties as a public servant, sent the dominos crashing here.
Because corruption is incredibly easy to hide and obscure, it is correct to cultivate a culture of transparency and accountability when, for example, tens of millions of dollars change hands between powerful institutions for no apparent reason.
Yep, Vim is the first thing to come to mind. I'm not sure there's any program with better undo-ability.
I also am not quite sure what the post above means by "GUIs support undo". Yes, most apps, e.g., on Windows, support CTRL-Z for undo. But this functionality depends on each app implementing its own undo, nothing OS-wide, and results will vary. I'm not sure what there is in GUIs that supports "undo" other than an app/user convention like CTRL-Z, which hardly seems to rise to the level of "supports undo".
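To make the point concrete: undo is per-app machinery, not an OS service. A minimal sketch of the command-pattern stack each application typically builds for itself might look like this (hypothetical names, not any particular toolkit's API):

```python
class UndoStack:
    """Bare-bones per-application undo/redo, command-pattern style."""

    def __init__(self):
        self._undo = []  # (action, inverse) pairs already performed
        self._redo = []

    def do(self, action, inverse):
        action()                        # perform the edit
        self._undo.append((action, inverse))
        self._redo.clear()              # a new edit invalidates redo history

    def undo(self):
        if self._undo:
            action, inverse = self._undo.pop()
            inverse()                   # roll the edit back
            self._redo.append((action, inverse))

    def redo(self):
        if self._redo:
            action, inverse = self._redo.pop()
            action()                    # re-apply the edit
            self._undo.append((action, inverse))


# Usage: an "app" whose document is just a list of lines.
doc = []
stack = UndoStack()
stack.do(lambda: doc.append("hello"), lambda: doc.pop())
stack.do(lambda: doc.append("world"), lambda: doc.pop())
stack.undo()   # doc is back to ["hello"]
```

Every app re-implements some variant of this, which is exactly why results vary from one program to the next.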
"I've been programming for a long, long time. Went to college for it when I was a pre-teen."
Yet you don't tell us how old you are, so it's hard to get a grasp of how long you've been programming. As someone who first started programming in 1979, I notice that there are plenty of people who say they've "been programming a long time" who weren't even born when I started programming. This isn't a criticism, just a comment that we don't really have any idea how long you've been programming. I can picture a 29-year-old saying the same thing you did. Time is relative, something that becomes more and more clear, I think, the older a person gets.
> As someone who first started programming in 1979
I absolutely love that there are seasoned experts like you on HN.
Would you mind sharing with us what you're working on these days?
Compared to you I'm relatively young (I've been programming around 26 years) and in my circles I don't get to mix with too many older programmers.
Would love to hear your thoughts on career trajectories.
For me personally, I'm gradually spending less time on pursuing commercial interests, and more time on pro bono projects - and I love the idea of working on open source software indefinitely once I retire.
Same here, 31 years. I'm 43 and started in around 1989-1990 at age 12. I didn't know anyone who had ever even attempted to program a computer, so I was completely on my own. I have a good grasp now but remain humbled by the craft (coding every day). Nearly everyone I know now is either where I was twenty years ago or insanely elitist.
Still on my own.
Another relatively young programmer here: born in '80 but started programming somewhere between '92 and '94 on a Commodore 64.
I've been extremely lucky to have worked mostly with ordinary programmers, with some extremely good ones[1] thrown in, and luckily even among the brilliant ones all except two were also down to earth and nice.
[1]: "master of all trades", all-knowing teacher types with "saintly" patience, learns-anything-in-two-hours-and-proceeds-to-fix-hard-bugs-after-lunch
I am 52; I started programming in BASIC in 1979. I am currently disabled and unable to work. I am trying to get healthier so I can get back to work.
Programming became a lot easier when Visual BASIC and Delphi came out. Just drag and drop controls.
Due to ageism I am sure I don't fit the culture of a startup or relate to 20-somethings. They hire young anyway, not old. So I do tech support for family and friends to get by.
And it can slightly change how one does and says things, which others can notice -- to some small extent, it can become a self-fulfilling prophecy.
(And this can work in a good way too -- if you say to yourself, maybe: I like these people, I like most people, what matters is not age but whether the others are curious and want to learn new things.)
Now, of course I do believe that ageism is a thing; still, I'd think there are quite a few good workplaces that aren't much affected by it.
While there is a degree of ageism in the industry - avoid companies that advertise their ethos as "work hard and play hard" because having to do most of your office politics half-drunk in the bars after work is not much fun - there are also a lot of people in the industry who genuinely care more about a person's ability to learn and adapt than the date on a birth certificate. Believe in your abilities. Wishing you good health for the future!
Thank you. I've learned 27 different languages since 1979; most are so old that there are no jobs for them anymore. I used to be a master at Visual BASIC until Dotnet came out.
I could learn any language on the market if I wanted to. I'm a quick learner, as I have the theories of computer science in my head as I learn.
My first "paid" gig was winning £50 for submitting a game written in AMOS Basic to an Amiga computer magazine, which got picked as the magazine's "Game of the month". It's still an achievement I'm really proud about.
I’m 53 and started FORTRAN programming in 1975ish on the VAX at my mom's work. Bought an Ohio Scientific C2-8P a year or two later with my brother, and that’s when I got into programming games and really started to learn (BASIC). Fast forward 44 years or so, and I’m working in Solidity writing contract code at a blockchain startup with a bunch of early-20s guys. They call me dad and are always asking advice on architecture and data structure problems. Just closed our first round.
Wow, that's an even longer time. I started tinkering with code in the late 80s when I got my first computer at 4, then went to college when I was 11. First job at 14. I feel like 30 years of coding is long enough to feel like a long, long time. My pops had punchcards at home from the good ol' days.
I wasn't part of the first couple of waves of programmers, but I think it is fair to say I was in pretty early. Retyping out programs from magazines isn't even something most programmers have considered these days, let alone programming without the internet.
But the essence of your comment is right. Of course there would be people out there that have programmed for twice as long as I have. That's a little frightening to think of.
> Retyping out programs from magazines isn't even something most programmers have considered these days
Oof, this takes me back to the day I learned about RAM the hard way. I was typing out a program from a magazine. It seemed like it took forever, even then. About halfway through the computer rudely informed me that 4K of RAM is not, in fact, enough for everyone.
Ha. Soviet magazines were more considerate. They listed memory requirements well in advance, back when I was typing in racing games for my programmable calculator in the late 1980s.
I learned originally on a VIC-20 by typing in games out of books from the public library. At some point we upgraded to an XT and a friend sold me a copy of Power C for $20. It came with a beautiful hard copy library reference and the rest, as they say, is history!
Power C! I grew up in Germany and after the inevitable BASIC, C was the second language I learned, using Power C as a compiler, which I ordered by mail and which arrived from the States several weeks later, including the hard copy reference manual you mention. I also remember it came with a rudimentary graphics library I used to create screen savers for friends. Good times.
Still costs the same now as it did when I bought it!
Yeah, the graphics library was great! When I moved on to Linux and gcc, I was disappointed for a while that I didn't have all those super simple primitives to work with.
> Retyping out programs from magazines isn't even something most programmers have considered these days
I'm still retyping stuff from stack overflow instead of copying. I find it really effective to really think through the code you're borrowing from somewhere — because once it's committed under your name, you're the one responsible for it.
The last thing I think I got from SO was an implementation of the Boyer-Moore Algorithm for a byte searcher. I think retyping it would have probably introduced bugs: and as it worked on a test case I had to hand and could verify (finding the data header size in a WAV file by looking for `data` followed by the SubChunk2Size bits, which I could verify with `afinfo`) I was happy to use it rather than learn how the algorithm worked.
As Morpheus said, “Time is always against us”.. so I just made sure it followed our coding standards, checked the test cases, and moved on.
But, I am old enough to remember code listings in magazines. I like to think the typesetters introduced deliberate mistakes because they hated the work so much - not to disrespect the fine profession of typesetters, but when you set your 100th `Poke` command in a row, you might think this isn’t what you signed up for..
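The `data`-chunk search described a couple of paragraphs up doesn't strictly need Boyer-Moore; a sketch of the same idea with a plain byte search (hypothetical helper, not the actual Stack Overflow snippet) looks something like:

```python
import struct

def find_data_chunk(wav_bytes):
    """Find where audio samples start in a RIFF/WAV byte string.

    A plain bytes.find stands in for Boyer-Moore here, which only pays
    off on much larger haystacks than a WAV header.
    Returns (offset_of_samples, declared_data_size).
    """
    idx = wav_bytes.find(b"data")
    if idx == -1:
        raise ValueError("no 'data' chunk found")
    # The 4 bytes after the tag hold the chunk size, little-endian
    # (the SubChunk2Size field from the canonical WAV layout).
    (size,) = struct.unpack_from("<I", wav_bytes, idx + 4)
    return idx + 8, size

# Tiny hand-built WAV-like blob, just enough structure to exercise it:
blob = (b"RIFF....WAVEfmt " + b"\x10\x00\x00\x00" + b"\x00" * 16
        + b"data" + struct.pack("<I", 4) + b"\x01\x02\x03\x04")
```

One caveat with the naive search: the byte sequence `data` can in principle occur inside another chunk's payload, so a robust version walks the chunk list instead of scanning -- which is presumably part of why grabbing a tested implementation and verifying against `afinfo` was the pragmatic call.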
> I'm still retyping stuff from stack overflow instead of copying. I find it really effective to really think through the code you're borrowing from somewhere
Numerical Recipes in C. Had the hard copy but not the disk.
Often the code was just too obscure to work out as you typed, but there was some real value in typing it in. You got a real feel for it. Additionally, it offered hard lessons in writing test cases.
Please don't call me that. I really believe any 11 year old could learn what I did with parents like mine.
When I compare the skill involved with something like violin versus the skill that's needed to validate HTML forms with JavaScript or creating an application in Visual Basic, I really do not believe that people that happened to study software at a young age happen to be geniuses just because they did. Yes I'm smart, but I really believe that this path could be open to anyone that age if they have the interest and I think the internet has unlocked many people that have learned the same skills without the credentials.
I used to have this mindset, but then I had a small stroke and lost significant IQ points. I slowly, but never fully, gained them back over two years. I realized that the ease with which I saw/see solutions, compared to others, wasn't just related to the time I put into thinking about them. Much of it came for free, in what I can only describe as the length and number of the tendrils reaching out to explore whatever "problem space".
I no longer believe that "anyone with an interest" can be at the same level as someone that can just see the answers, with little effort. Some people have fewer/shorter tendrils.
This has definitely changed the way I interact with people. I used to get frustrated when people, who I thought should be able to understand, couldn't. Now I realize that they just can't as easily. They need that picture drawn out for them, and even then, they'll never see the nuances or perceive the textures of the problem, unless you point it out to them.
I think I'm lucky for being born with the mind that I have. It has made my life easy, pulling me out of poverty, with a mostly addictive enjoyment in what I do. I think you're probably luckier than you realize.
The way I talk about this is to frame what people call intelligence as the combination of memory (+ actual memories) and comprehension.
Your ability to 'just see the answers', in this framing, stems from having a lot of data points readily available and the ability to combine them together quickly.
There are definitely people who are better at remembering things, and piecing multiple ideas together quickly, but these are also skills that can be trained. I think it's likely that a lot of 'intelligent' people are simply people who actively (though usually not consciously) train these skills because they enjoy them.
In the same way that many fit people don't have to think about exercising - they do it because they enjoy it or without any particular goal - there are people who see an interesting problem and immediately start thinking about how they might solve it or how it's similar to other problems they've seen.
In the same way that anyone can implement a training regime to improve their fitness I think anyone can implement a training regime to improve the number of data points available to them (read lots!) and their ability to combine that information together (solve puzzles, especially theoretical/not personally applicable ones like "how would I get that boat free?").
You find it odd that people think about what intelligence is, or how it presents?
If you’ve got links/references/keywords for research that invalidates (or validates!) these ideas please share them, I’d love to look them up. From what I’ve read the idea “intelligence can be (at least in part) described as having knowledge and being able to apply that knowledge to new problems” is a well trodden one.
I haven’t seen much on the idea that some people may be predisposed to engaging with stimulating situations, so anything you have on that topic would be highly appreciated. I have seen writings on how a stimulating environment is important, and on how encouraging engagement can be effective (for example asking questions of children and allowing them to answer, vs answering for them).
[edit]
In my original post I should probably have written “is to frame a lot of what people call intelligence as” - I definitely don’t think this is all intelligence but I do think it has a significant role in what the gp was talking about, this ability to see answers quickly.
> [edit] In my original post I should probably have written “is to frame a lot of what people call intelligence as” ...
Ok, then I understand better what you mean.
Actually there's a word for that type of intelligence:
Crystallized intelligence.
And I had another type of intelligence in mind:
Fluid intelligence.
I think we spoke past each other (or I spoke past you) thinking about different things.
Anyway, one of those one can improve, e.g., by reading and getting life experience. But the other one is fixed (from what I've read) once one is grown up.
If you want to, you could websearch for those words. And also, wikipedia has a section about intelligence and inheritance (hint: life is unfair).
> I have seen writings on how a stimulating environment is important, and on how encouraging engagement can be effective (for example asking questions of children and allowing them to answer, vs answering for them).
That sounds great :-)
From what I've read, those things do work (!), when one is a kid / young. And from what I've read, it also prevents the brain from deteriorating, when one is old (using one's brain reduces the risk for dementia).
> everyone seemed to think I was smarter than I believed I was. I feared I might fail miserably and finally prove how wrong they were about me
I can totally relate to that. Once everybody told you you're a genius, the pressure not to fail is incredible.
I started programming at 8. I got next to no help from my parents or my teachers until the time I entered college, and by that point I felt I knew as much as the professors, sometimes more. I always avoided talking about programming, since that would get me more genius calls on top of what my grades got me. And it doesn't help with making friends. Over the years I had maybe one or two friends who knew about it. Few would've believed me if I had told them what I could do.
I feel like there's nothing special about the path I took. I feel like anyone would be able to achieve the same knowledge I did, given enough work and support. I must have spent thousands of hours programming in my teens. What nobody seems to realize is that the genius label is wrong; what they really should have told me was that I was "passionate". Anyone who is passionate enough can become a master.
> I must have spent thousands of hours programming in my teens.
That's what I think about when I hear people complaining about gatekeeping in our field. The books are open, the courses are there, interesting and useful applications abound. Given the same level of effort I think much of the difference between social groups would vanish.
I don't know what the university courses entailed. I'm basing "genius" on my knowledge of the current Computer Science curriculum. If you were doing CS courses at age 11 I do think you must have genius level intelligence.
If it was more practically-orientated, then I agree with you :).
I'm not really sure what to work on next. I was focussed on arms control for cyberweapons for a while, and I made some real progress, but I want to work on something new now. Maybe finding a way to scale up good things like trust or good will? I want to find something where I'm making the world a better place but also working on something that makes me smile. Trying to fight weapon dispersion is exhausting and discouraging and, ultimately, as I learned, futile.
> Maybe finding a way to scale up good things like trust or good will? I want to find something where I'm making the world a better place but also working on something that makes me smile.
Have you made any progress finding something new to work on?
Maybe I'm projecting here, but I'd imagine this is the dream of most of HN, no? But I don't know which is harder: finding such a unicorn idea, or executing on it once you've found it.
> Maybe finding a way to scale up good things like trust or good will?
Have you considered working in the cryptocurrency space next? I think that would satisfy your desire to find ways to scale up trust. One of the key value propositions of crypto is building trust at scale on the pillars of decentralization, cryptography, game theory, and economics.
> Retyping out programs from magazines isn't even something most programmers have considered these days, let alone programming without the internet.
Wow. Flashback. I started programming late age wise (college freshman in the 90s), because until that first student loan we didn’t have enough money to buy a computer. I would go to the local bookstore and copy code out of the programming magazines and books. I remember writing some c++ code, bumping into a problem I couldn’t solve and driving to the bookstore to look at the books for a solution.
Going to college for programming must mean they are quite young indeed. When did colleges even start offering programming degrees? Unless maybe this is some sort of vocational college.
Depending on what is meant by 'programming,' computer science really grew out of mathematics and occasionally physics departments at colleges in the late 50s and early 60s. Universities started establishing CS departments in the US around the mid- to late 60s. I'd say anyone exposed to that period onward could be considered as having formal training in "programming" in a college or university environment.
I have a good friend who worked during the 60s era programming with punch cards, doing applied physics work in FORTRAN (pre-77), which was already pretty big by then. You could probably go back a bit further, but I don't think much was being actively taught then in the sort of course one might expect today. So I'd say you could have at most 65-ish years of programming since formal use in college.
No, it was not software engineering at the time per se but I'd absolutely call it programming.
Electrical Engineering departments were also starting to offer more and more "programming and computer" related courses as well. At a lot of universities these eventually branched out to become Computer or Software Engineering programs.
My uncle worked at the National Institutes of Health for about 35 years doing bioinformatics and protein structure modeling. In FORTRAN. Always in FORTRAN. He knew other stuff, but the bulk of his work was always FORTRAN.
I didn't learn fortran in university, but I did have to learn it for a job. It's still THE language for scientific computing. So in any job in or adjacent to scientific computing, you're bound to run into fortran.
My mom taught programming at the college level in the early 80s, in a computer science department at a state university. At that time, the big universities had computer science departments. The 4 year colleges were more of a hodgepodge, ranging from full blown CS, to a handful of programming courses offered by the math department. By the time I graduated in the mid 80s, CS departments were pretty widespread.
Someone who entered college at the start of the dot-com bubble is in their mid 40s, and that's not even when CS degrees were first offered, just when they started entering the public consciousness.
You could easily have a CS degree and be past normal retirement age.
CS College degrees in dot com era were hit and miss. Also often had a weird mix of electrical engineering courses thrown in.
Was quite common to do 2-3 years then drop out and start a job. So much so that people with degrees were often looked down on. Exceptions for things like MIT.
After year 3, there was nothing left for me to take. So I took a job.
I would like to have a degree, but it was the right call at the time.
Few years ago I went back and started an Art degree. Was a blast.
I'm still holding onto some of my mom's CompSci homework from the early 80s or so. Mostly based on flow diagrams and what amounts to state machines. Sadly no punch cards, though she talked about taking them in to run assignments.
Story goes that she had a campus job cleaning, and made good friends with the guys in charge of running the mainframe by bringing them food and drinks when she stopped by. Which of course meant she could often get them to sneak her stack of cards into the queue overnight.
Details are fuzzy since I last heard the tales over a decade ago, and haven't dug the assignments out in forever.
NC State was celebrating 40 years of CS in 2006 or so and they weren't the first. CS degrees have been around since the 1960s, so 50+ years at this point of CS as a separate degree program in the US. Apparently Cambridge offered their first CS degree in 1953.
Yes, but they took quite some time to spread to the whole country and to every major university. That didn't really start until the 70s and 80s (in line with when NC State got established).
I switched my major from 'business' (yawn) to comp sci 35 years ago (1986).. at my school, it was previously a 'concentration' in the Math department, which meant you got a BS in Math, concentration in Computer Science. It wasn't a full Bachelors of Science degree until about two years before I got there.
I think you are kidding. Even my no-name southern university had CS degrees 40 years ago. We had 3: one that was computer engineering, one in arts & sciences, one in the biz college.
Started on a pdp-8/E in 1973 with its staggering 4k of 12-bit memory. "Going to college to program" might have meant going to where the machines were as most of them were on the large size... We had to descend on the college computer centre, which drove the adults nuts. Rugrats running about, fixing their programs for them. Times were different then, but the graduate-to-hacker test was to start with blank paper and end up with a working program. It was too big and too slow, but at the end the new hacker was enlightened.
They probably meant they went to college for something like computer science or computer engineering that includes a ton of classes that involve a lot of programming?
Or, since he says "pre-teen", it could just be that his parents sent him to a programming class at a local community college when he was twelve years old. That's the sense I get, actually. The number of people who start actual college as a pre-teen is vanishingly small.
We had a Pascal class on a VAX in my high school in 1983, and it was fantastic. We used the classic "Oh! Pascal!" text and it was great preparation for college. There was no CS major at the liberal arts school where I ended up, but there were courses with Turbo Pascal taught by the math department before I graduated.
I first programmed a calculator (well, copy-keyed a hangman game program into a calculator) back in 1980. Passed my 'O' level computing exam in 1983. Started building websites in the last years of the 20th century. Looking back, none of that feels like "programming" to me. I finally started programming when I had a mind-blowing "A-HA!" moment about what Object Oriented programming was all about in 2009 - which made a change from the many, many "wtf" moments I had with non-Basic-like languages before then.
Also not directed towards you, but just because someone has programmed for a long time doesn't mean they're particularly good at it. Plenty of people don't learn - it's not even a character flaw; some people just program for the job.
Yes, you are correct. The people who say the "piano" comment was meant just as an observation, not as criticism, are wrong. They're ignoring that this was made as an "observation" specifically about free software, with the obvious implication that this is a key difference between "free" software and commercial software. That means: this was not _merely_ an "observation"; it was an observation intended to be a criticism of free software. This is just basic reading and interpretation skills. (Could this interpretation of the comment as a "criticism" be wrong? Of course, but I think that's highly unlikely.)
Yes. Also, for a person with the requisite piano tuning/repair skills, a free piano can be a great thing. And this person can identify which free pianos are great things and which are likely to be unsalvageable. For someone who has no clue about pianos, or on how to work on pianos, not so much.
A lot of open source software is intended to be used by developers, or, in other words, by someone who knows what they're doing. Not all, of course. There's plenty of open source stuff out there targeted at end users, or at admin-types, non-developers. Much of it is excellent. (In fact, much of it "runs the internet".) Much is not, or is really in a semi-developed state where it needs to be improved by actual developers.
None of this is anything new. You need to be aware of how "ready for use" the open source project you choose is. By the very nature of how open source works, projects are available "in the wild" when they aren't really ready for general use.
I grew up in an economy where it was. When I was born, we were living in a 4-person, 3-generational, ~500sqft 1.5-room flat (the large kitchen was subdivided to make a tiny bedroom); later, the govt decided that we deserved better and I ended up growing up in a 4-person, 2-generational, ~650sqft 2-room flat. I wonder what my parents, two engineers, would have been able to afford in the USA?
I think I had a very happy childhood, but comparing these things kinda allows you to reflect. Interestingly, the low standards stick; when my wife and I bought a small 1100sqft house (in the USA), some of my friends back in Russia were like "oh, a big house, are you planning for kids?" ;)
I lived in a country where housing (at the time) was provided by the government. It's not as great as it might sound. Want to move? Do the paperwork and wait a few years. Want a bigger place? Too bad. Not to mention a single room for the whole family, with shared kitchen and washrooms.
The quality of the housing is going to be proportional to the wealth of the country per capita. In the US, we simply allocate quality of housing based on ability to pay, so many people have incredibly bad housing conditions.
Right, of course the only choices we have are government provides housing or everyone sleeps on the street. Not like we have any examples of any other system.
Our highways are free because they're a public good (in the economics sense) and prone to monopoly pricing due to geographic constraints on competition.
While it isn't inconceivable that housing could be public, the same underlying economic rationale isn't there, and private housing works really well. You'd be solving a non-problem (or, to the extent that there is a problem, it's one that's easily addressed by simply increasing private housing stock) and risking a lot to do so.
> Our highways are free because they're a public good (in the economics sense) and prone to monopoly pricing due to geographic constraints on competition.
And homes are not? Certainly not at the same scale, but we are seeing the same problems with landlords that we see with monopolists.
Landlords are able to charge extremely high rents and there is not enough available/affordable land to build competition, especially in cities experiencing NIMBYism and gentrification.
> You'd be solving a non-problem (or, to the extent that there is a problem, it's one that's easily addressed by simply increasing private housing stock)
It's clearly a problem. That's why we're here talking about it in the first place. It's also clearly not "easily addressed", or that would have happened already. Sure, we need to remove barriers to increasing housing stock, but that isn't likely to be enough, especially in the short term.
Homes aren't a natural monopoly. Highways largely are.
"there is not enough available/affordable land to build competition"
There definitely is in the large majority of places. The only city that can possibly say that honestly is Hong Kong, but even there they could go a bit more vertical and more dense if they needed.
"It's also clearly not "easily addressed","
Whatever lobbying hurdles you need to overcome to reduce regulatory interference on increasing the housing stock, you're going to face those same hurdles (and then some) if we're talking about public housing. So it doesn't make sense to immediately go for the radical and untested solution when an easier and proven solution is waiting. If we build vertically and it doesn't work (which it will, but nevertheless) - only then does it make sense to consider something more radical and more difficult to push through.
Yes, exactly. I would add, though, that the "Wuhan lab leak theory", in many people's minds, seems to be combined with a suspected intentional act by the Chinese government, to unleash a dangerous virus on the rest of the world. I think we should dismiss that part of the theory (it's of course not impossible, just not likely, mostly fun fodder for conspiracy nuts). But as far as infectious disease labs all across the world being dangerous places that need strict safety and security measures, duh, yes.
Let's also add in that the "Wuhan lab leak theory", combined with the "China virus" nomenclature, has resulted in huge increases in racist and white supremacist violence against people of Asian descent.
There were 49 incidents of anti-Asian hate crime in 2019 and 122 in 2020. I get that those numbers should be zero. But am I way out of line in saying that something that affects ~.00006% of Asian-Americans, and makes up ~1.6% of total hate crimes, should have no bearing on how we approach this subject?
My genuine apologies if I am crossing a line. I know this is a potentially touchy subject. Hate crime is serious and has many negative externalities that other crimes and accidents don't carry. They have also been on the rise, and could continue to grow more significant. It just feels very strange to me that 70 additional crimes in a year that saw thousands of additional murders has been such a common talking point for months now.
The 9/11 attacks killed less than 3,000 people. Or if you want percentages, resulted in the deaths of about 0.0009% of the U.S. populace. Yet it sent our country to war and has had an impact on millions of people. It is in the very nature of terrorist acts that they "terrorize" the wide populace, while only a tiny fraction are ever victims of terrorism. It is similar with hate crimes.
Human psychology deals with numbers strangely. There are many who seem to think 500,000+ deaths (many preventable) from Covid are not something to be overly concerned about. Some of these same people are deeply worried about "Extremist Muslim terrorism" that has had very few victims.
So, yeah, from what I understand about growing anti-Asian crime, I do think it makes sense to be concerned. In particular, because this increase seems to be a (predictable) response to actions by many over the past year to demonize China, which any sane person knew would create a generalized animosity toward Asian-Americans. It's not like things like this have never happened before. They have, and they're quite predictable.
I'd like to first take an aside and apologize for a previous error. I divided incidents by population and came to 0.0006. This is off by an order of magnitude. But that is not the worst of it. This number belongs in the context of crime rates. Gallup [1] tells me that 1-3% of people are victims of violent crimes. So I must further multiply by 100 and conclude that hate crimes represent 0.6% of the total violent crimes experienced by Asian-Americans. And once again, I have to add that I did find numbers that suggested Asian-Americans may be victimized much less than the general population, although these numbers were from 2006. 0.6-6% is the final answer. This is a massive misrepresentation, and I want to be clear that this was not intentionally manipulative. It was quick thinking and poor judgment.
I agree that the US response to the threat of terrorism was also very much an overreaction, so at least you can say I'm consistent.
From what I understand, the total number of hate crimes decreased in 2020. I haven't been able to find the data and if, for example, this is because the number of hate crimes against whites dropped, the following is false. But in my mind this fits a model where X people are going to attack minorities in a given year, and this year, for obvious and insane reasons, they typically targeted Asians.
I understand the frustration and pain and cause for pushback. I say this because the next part will come across as cold. From a utilitarian perspective, there is not any material difference between worlds where different minorities are victimized. Changing the targets doesn't solve anything.
Then the solution is to teach people not to mistake Chinese people for the CCP, not to police language. And this narrative of "white supremacist" violence is a concoction by the media. Whatever increase that can't be accounted for by increased reporting is most likely not coming from the white supremacists or even the white demographic. It's an increase in inner city tensions that have been around for decades. And the people predominantly committing these acts (and I assure you statistics point to a single demographic in particular) are probably not the type to follow Trump's speeches.
It's infuriating. I'm not sure what the solution is though. We can just not talk about the very real lab leak hypothesis because some people are dangerously unstable.
You could even go back another step and recognize that the person you're replying to has made completely unsubstantiated and fabricated claims about:
1. An increase in violence targeted specifically at Asian people that is in excess of the already-documented rise of violence in general experienced across all groups in 2020.
2. An attribution that this imaginary excess violence is "white supremacist" in nature and intent.
3. A direct causal connection between this imaginary and poorly-attributed violence stemming specifically from the origin of the virus.
It's easier to defend freedom to hypothesize when you realize that the people advocating against said freedom are, themselves, simply making shit up.
you can just call it covid-19 instead of the insert racist nickname here for it and then discuss where it may have originated. those two things are not mutually exclusive. hell, the spanish flu is generally not thought to have originated in spain, but they were the only ones talking about it because the other countries had a gag on discussing it.
perhaps. I mean having the leader of the free world spout conspiracy theories and give credence to them certainly hurts.
in the past you'd have crackpot conspiracy theorists spouting off their "knowledge" at the bar to anyone that would listen but most would shy away from the crazy person. now you have a mainstream leader saying crazy stuff and have a huge following of people spouting that off because you can get misinformation and half truths at the speed of sound. yeah some vetting of information should be there.
a lot of the "proof" I've seen have been from being ignorant of what scientific terms mean, deliberate mis/disinformation, and wholly not understanding cause and effect. the other thing that lets these propagate is the downright innumeracy of our societies.
my brother has gone down a dark path of this shit to the point that I am very disgusted by the "truthers" poisoning the minds of people. he used to be a decently intelligent man but he's gotten hit with the gish gallop of disinformation and lies.
Not following. The inspector is generally out of the picture after the inspection, however it goes. If they identify problems, specialized contractors (e.g., plumber, electrician, carpenter) will be contacted to get more detail on the problem. Besides which, inspectors rely on word-of-mouth referrals from real estate agents even if they're always employed directly by buyers. If an inspector frequently blows up deals with overly critical inspections, their referrals will dry up.
> "General rule of thumb: rent if strongly believe you'll be in a place for less than three years, and consider buying if staying longer."
This is a valid rule of thumb only because transaction costs are generally much higher when buying a home than when renting. Typically 6% of the cost is siphoned off by brokers (3% to each side) and the fees people pay to take out a mortgage loan are significant. Fees you pay to start a lease generally amount to only a small fraction of these.
It's also valid because the real returns of investments typically exceed the gains on housing[1], and you tie up a minimum of 20% of the value of your home when you buy.
1: This gets complicated because (at least in the US) most people don't leverage their investments at a 5:1 ratio, but do leverage their home value by that much.
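To make the footnote's leverage point concrete, here's a quick sketch (all rates are hypothetical, chosen only to illustrate the 5:1 ratio a 20% down payment implies; it ignores mortgage interest, taxes, and maintenance, which matter a lot in practice):

```python
# Illustrative only: with 20% down, home price gains accrue on 5x your equity.
down = 0.20                # down payment fraction (5:1 leverage)
home_appreciation = 0.03   # assumed annual home price gain
market_return = 0.07       # assumed unleveraged investment return

# Gain relative to the cash you actually invested (the down payment).
return_on_equity = home_appreciation / down
print(round(return_on_equity, 2))  # 0.15 -> 15% on equity from 3% price growth
```

So even modest appreciation can beat an unleveraged 7% market return on paper, which is exactly why the comparison "gets complicated".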
> real-returns of investments typically exceed the gains on housing[1]
In markets where people want to live. Owning a couple acres around the far-flung suburbs of Seattle or DC will net you some good money.
No one is buying housing in Gary, Indiana. Floods have annihilated many towns around the Mississippi River, and there is plenty of real estate in Detroit and New Orleans that sure as hell ain't gonna exceed the S&P 500 anytime soon.
I think you read my comment backwards? I was trying to say that you get more reliable returns if you invest your down payment in a diversified fund than if you use it to purchase a house.
As far as "markets where people want to live" goes: yes, in any market, you can outperform the average if you can predict the future better than other investors. Some of the places around the Mississippi River probably used to be considered good investments; now they aren't. If this whole full-time remote thing catches on, it's possible that the bay-area will drop in prices. It's also possible that only part-time remote catches on, and interviews resume in person so people want to stay in the Bay Area. Heck, it's unlikely, but possible, that Cupertino or Saratoga decides to rezone large swaths of the town for more density and the increase in supply drops prices.
A good rule of thumb for selling costs is around 10% of the selling price. And when buying, figure around 3% just for closing costs (or more if you're paying points), then add in more for funding reserves.
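Putting the thread's rules of thumb together, you can see why short ownership periods are painful. This is a rough back-of-the-envelope sketch; the price and the 3%/10% rates are just the hypothetical figures from this thread, not actual quotes:

```python
# Hedged sketch: round-trip transaction costs of buying, amortized per month
# of ownership, using the thread's rough rules of thumb.
def amortized_transaction_cost(price, years, buy_cost_rate=0.03, sell_cost_rate=0.10):
    """Total buy+sell transaction cost, spread over each month of ownership."""
    total = price * (buy_cost_rate + sell_cost_rate)
    return total / (years * 12)

# Example: a hypothetical $400k home held for 3 vs. 10 years.
print(round(amortized_transaction_cost(400_000, 3)))   # ~$1444/mo
print(round(amortized_transaction_cost(400_000, 10)))  # ~$433/mo
```

At three years the transaction costs alone rival a rent payment in many markets, which is the intuition behind the "rent if under three years" rule upthread.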