gaucheries's comments

i know i do


While I was in undergrad, I volunteered at SPD’s warehouse in Berkeley for a summer. Although I enjoyed my time there, and I learned a lot about non-profits, publishing, and more, I found it to be a very strange, tense, and awkward place to work. And cliquey. I did not feel welcome, seen, or accepted. Kind of like the Bay Area at large over the past decade or more (and I speak as a regional native).

An anonymous former employee supports the claim that SPD was a poor place to work, run by toxic people: https://damagedbookworker.medium.com/terrorized-by-spd-61201...


While I feel for this person, this gets very messy when people passionate about the industry agree to "unpaid internships". Yes, it's not right and they should stop, but it's a little like letting someone screw you with no commitment and then being upset that they don't feel the same way you do.

I know the arts are rife with this stuff, but stay away from it.


> (...) it's a little like letting someone screw you with no commitment and then being upset that they don't feel the same way you do.

I strongly disagree with this take. Contributing your time to a non-profit is not "letting someone screw you with no commitment". More so if you're an intern at a non-profit that explicitly set out to offer unpaid internships to anyone who applied. Volunteers are there to contribute without expecting a salary, but that is not consent to abuse. Quite the opposite.

Also, you're talking about completely distinct things. Volunteering for an organization is not a free pass to be treated poorly or abused by current staff. In fact, organizations that open internships understand well that it is in their best interest to create a good environment and a good experience. They know they are opening their inner workings to the outside world and creating experiences for people whose main deliverable to the organization is not work but PR, in the form of testimonies and first-hand accounts.


Nice work, but I am surprised that you would charge people money to use this, and I imagine that it will be promptly shut down by Google.


Yes, this. Also, why pay when you can do what they did and get the autocompletions for free?


SEOs and digital marketing departments need this kind of service, especially for a huge number of these queries. Getting this kind of data for “free” is not so straightforward once Google detects it.


Isn't Google more likely to detect that this service is consuming their API and then block them?


That's the entire value of the API. They deal with the proxies and reverse engineering for you.


I appreciate the way that duesee handled that whole issue.


I don’t think the naming itself is the point of the joke.


idk why you got downvoted. But I do too. I think it’s because of the band Joan of Arc - they use “live” in that way in a couple of their records’ titles.


I think YouTube locked down their APIs after the Cambridge Analytica scandal.


in the end, that scandal was the open web's official death sentence :(


The issue wasn't the analytics either. The issue was the engagement algorithms and lack of accountability. Those problems still exist today.


So as usual, the exploitative agents get to destroy the commons and come out on top.

We need to figure out how to target the malicious individuals and groups instead of getting so creeped out by them that we destroy most of the much-praised democratization of computing. Between this and the locking down of local desktop and mobile software and hardware, we never got the promised "bicycle for the mind".


no one promised you anything


And what kind of accountability is that? An engagement algorithm is a simple thing that gives people more of what they want. It just turns out that what we want is a lot more negative than most people are willing to admit to themselves.


I would rephrase that to 'what we predictably respond to'.

You can legitimately claim that people respond in a very striking and predictable way to being set on fire, and even find ways to exploit this behavior for your benefit somehow, and it still doesn't make setting people on fire a net benefit or a service to them in any way.

Just because you can condition an intelligent organism in a certain way doesn't make that become a desirable outcome. Maybe you're identifying a doomsday switch, an exploit in the code that resists patching and bricks the machine. If you successfully do that, it's very much on you whether you make the logical leap to 'therefore we must apply this as hard as possible!'


Engagement can be quite unrelated to what people like. A well-crafted troll comment will draw tons of engagement, but not because people like it.


If people didn't like engaging with troll comments, they wouldn't do it. It's not required, and they aren't getting paid.


This comment has a remarkable lack of nuance in it. That isn't even remotely close to how human motivation works. We do all kinds of things motivated by emotions that have nothing to do with "liking" them.


I don't think people "like" it so much as hate elicits a response from your brain, whether you like it or not.

If people had perfect self-control, they wouldn't do it. IMO it's somewhat irresponsible for the algorithm makers to profit from that - it's basically selling an unregulated, heavily optimized drug. They downrank scammy content, for instance, which limits its reach - why not also downrank trolling? (Obviously because the former directly impacts profits and the latter doesn't, but still.)


This is really a childlike understanding of the world.


In what way were the Cambridge Analytica scandal and the openness of YouTube APIs (or other web APIs) related? I just don't see the connection.


Facebook's original open API was open for the benefit of good actors who wanted to use its data. You can disagree with how it was used, but you can't disagree with the intention.

With the CA scandal, all the big companies now lock down their app data and sell ads strictly through their limited APIs, so ad buyers have much less control than before.

It's basically saying: you can't behave with the open data, so now we will only do business.


CA was about 3rd parties scraping private user data.

Companies are locking down access to public posts. This has nothing to do with CA, just with companies moving away from the open web towards vertical integration.

Companies requiring users to log in to view public posts (Twitter, Instagram, Facebook, Reddit) has nothing to do with protecting user data. It's just that tech companies now want to be in control of who can view their public posts.


I'm a bit hazy on the details of the event, but the spirit still applies: there was more access to the data that was not 100% profit-driven. Now it's locked down because the companies want to cover their asses and do not want another CA.


Wasn't the "open" data policy what allowed Clearview AI to build profiles and provide them to US government departments?


They actually held out for a couple of years after Facebook and didn't start forcing audits and cutting quotas until 2019/2020.


The TechCrunch article’s real title seems to be more descriptive: “Cruise-founder and CEO Kyle Vogt resigns”


The author might be interested to know that this page seems to display in an undesired way on iPhone mini screens:

https://i.ibb.co/v4s0Xpj/IMG-1221.png


Eek. If anyone has a quick CSS fix for mobile I'll happily add it. I'm talking to some of my people too. Thanks for bringing it to my attention.


I'm amused by the fact that even authors of the most technically brilliant projects in recent memory also struggle with CSS, just like us mere mortals. :)

Love your work! <3


This is the only concession to mobile that I have on my website, and it seems to do okay:

    <meta name="viewport" content="width=device-width, initial-scale=1">
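
If that alone doesn't fix it, a minimal media-query sketch like the one below usually stops wide elements from forcing horizontal scroll (the breakpoint and selectors here are only guesses, since I haven't looked at the page's markup):

    /* hypothetical rules: scope these to whatever is actually overflowing */
    @media (max-width: 480px) {
      body  { overflow-wrap: break-word; }          /* break long unbroken strings */
      img   { max-width: 100%; height: auto; }      /* scale images down to the viewport */
      pre   { max-width: 100%; overflow-x: auto; }  /* scroll wide code blocks instead of the page */
      table { display: block; overflow-x: auto; }   /* let wide tables scroll horizontally */
    }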


This behavior really should have been the default in mobile browsers. It's not like emulating a desktop viewport actually makes giant tables usable on a tiny mobile screen, so it would have been better not to break pages that can reflow for the sake of a mostly useless hack.


I don't know, I think pinch-to-zoom + pan is a pretty decent way of dealing with pages that have more complex layouts than a basic document, but were designed using e.g. tables for layout. Given that most non-trivial pages would have been laid out that way back when mobile browsers were first coming out, it seems like a reasonable tradeoff to me.

That is to say, it's definitely a hack but I wouldn't call it useless, at least originally. These days it's a lot less useful since the number of pages that do benefit from this treatment has decreased dramatically. But at this point the die has been cast.


What’s stopping parents from allowing their kids to use Meta’s products?


The role of parental responsibility is a valid but separate matter.

Kids can sign up for social media from a friend's phone/computer/tablet, from a library computer, etc., and self-certify their claimed age. They can get phones (without a cellphone contract) from other people without their parents' knowledge.


It seems to me like Meta’s marketing harms everybody who uses their products… it’s not specifically kids.


ease of access.

