I've probably given away 15-20 endgrain cutting boards over the last few Christmases to various family and friends - nothing fancy, just random pattern to use up scrap from other projects instead of tossing it in the fire pit. I know of at least 4 that have never been used for anything but serving trays because they're "special".
I find it hilarious every time - they're made to be used and abused. I've told every one of them "if you actually manage to use it enough to damage it, I'll just make you another one", but they still sit there in pristine condition.
you gave people who know you your time, your thoughtfulness, and your acquired and honed skill. those are not cutting boards - they are photographs of a moment.
anyway, the way to solve your issue is to give them a second cutting board - then the first one becomes a non-scarce commodity and both will likely get used.
That will absolutely work; I'm exactly like that - if I have one (1) of something I will over-cherish it and never use it. If I have two (or more), I will now comfortably use one knowing I have a spare.
Not logical, but I haven't broken the habit yet :)
Abundance turns people into wasteful, inconsiderate assholes. This is true on so many levels that I am struggling to understand what your point of reference is.
This reminds me of how farmers in the Middle Ages did many things that lowered their expected crop yields but also improved their worst-case scenario.
It does make sense if you are trying to minimize risk to use cheap disposable tools when times are good and reserve durable long-lasting tools for when times are bad.
As a species, we are obsessed with RARE. We will willfully kill animals to the point of driving them from abundance to rarity, and only then will we spend millions to "protect" that rarity.
I mean, for the sake of argument, what's the alternative? Protect abundant resources and use rare resources? Use rare and abundant without discrimination? Or don't use anything / huddle in a hole and die?
More seriously - I think we went very fast from my post of "If I have a unique hand-made-with-love gizmo I will cherish it" to "having things makes assholes" and "we love to willfully kill animals for no reason". And that's fine, we all have our priorities and passionate topics and axes to grind, but I'm a bit sad we didn't get to explore that specific interesting tidbit of human psychology before reducing it / switching to one of the top-10 massive issues of the day...
a species is not a thing. it's a concept - a thought. a concept doesn't do anything, because it's thoughts. so a species cannot kill an animal. a species cannot spend millions to protect. because thoughts cannot perform an action, because they're not a thing.
what does happen is a group of people, which is a thing, kills animals to the point of them being rare. then another, opposing group, spends millions to protect those remaining animals.
i'd expect something like this from a reddit post, where I often see - usually on redneck antivax subs i visit for entertainment - things like "reddit is hypocritical", because the redneck is not capable of comprehending that different people, with different opinions and actions, both post on the same website.
it's so backwards to think that, but such is some people's life.
I don't think it's necessarily backwards. If you spend your life almost hunting a species to extinction and then protecting it, you'll always have something to do.
I like this perspective. I think it applies well to many things we collect.
I've been collecting fountain pens for more than a decade now and I've observed that new collectors end up buying their "trophy" pen only to have it remain uninked and kept in a drawer someplace. It's only a few more years later, when another "trophy" is inevitably collected, that both pens see use.
Similar story with watch-collectors - some people buy their grail, then leave it in a safe.
There's a big mismatch between "pristine" watches and those that have seen life and suffered the inevitable scratches, dents, and damage.
Yeah. I love watches but I decided I'm not going down that road. That's why I wear just a Casio. Though admittedly it's a $1600 titanium Casio with DLC coating and sapphire glass.. but it's still just a quartz watch and in the grand scheme of things not a huge deal if it gets dented or smashed. Wear it every day, everywhere. Really the only thing I regret about it is the integrated bracelet. Can't replace it with a cheap nato strap when the pins fall off and links vanish in snow.
The first thing I do when I get a new tool or work bench or what have you, is to whack a dent in it. That way I can stop being careful with it and start using it properly without worrying about keeping it pristine.
Yes! This is the exact reason I drive an old car (2008 Prius with 180k miles!). I could buy a new vehicle like a Black Model 3, but then I'd have to go through the hassle of buying and insuring a new car, and then I'd be nervous about scraping it on a sidewalk or getting bumped by a car door. Or what if the fancy new vehicle has software or hardware problems and I have to send it back to the shop for months?
Depends on your local consumer protections. Over here you're entitled to a product that is fit for purpose. A defective product is clearly not that, with or without dent. Of course getting the store to honor that is going to be a lot of effort pretty much anywhere in the world, but I don't feel like a small dent contributes to that.
To take a random example: I bought ear defenders in a hardware store. I used them fairly frequently for a couple of months, after which I put them on my head and the headband promptly snapped right down in half. I had to argue with the store that that's not a normal way for ear defenders to "wear down", but I got a new pair in the end which have been holding up for years now.
That's an interesting story, but to relate it to my previous comment, we have to ask: did you purposely damage the ear defenders for emotional reasons? If so, did you tell them that?
No, but as long as the damage is unrelated to the defect it's not relevant as far as the law is concerned. I have a right to a product that's fit for purpose. That right doesn't specify anything about the state of the product otherwise.
Is it also possible that they aren't using them because they don't want (or don't know how) to do proper upkeep? They can't just throw them in the dishwasher like their plastic cutting boards…
Friends valuing your work (even your scrap) to the point of it becoming a "white elephant". Maybe create a stamp that says "use me or lose me, give me away!"
i purposely started using a nice butcher block cutting board immediately after my friend's dad (who made it) gave it to me last year, so that i wouldn't fall into that specialness trap. but because it is special, i do tend to use it sparingly and care for it better.
> To find out, we ran another experiment in which participants imagined buying a bottle of wine. We had half of the participants imagine considering opening it one night, but deciding not to.
This is the second study like this on HN in a couple weeks.
When the heck did asking participants to imagine something, then asking them how they felt, and assuming this tells you something valid about anything other than imagining things become a popular form of research?
Seems like you could probably do some effective and efficient research using games. I always have to force myself to use the one shot super weapon, or even a few grenades, in order to not end the game without ever using them.
Video games seem like the perfect arena in which to do and apply this research.
I've identified this problem in my gameplay in roguelikes. It's hard to predict precisely what will be needed in later levels, so I find myself being quite a bit too conservative. I'd love to see research on this behavior.
> the paradox states that an increase in autonomous saving leads to a decrease in aggregate demand and thus a decrease in gross output which will in turn lower total saving. The paradox is, narrowly speaking, that total saving may fall because of individuals' attempts to increase their saving, and, broadly speaking, that increase in saving may be harmful to an economy.
This is the fallacy of composition applied to aggregate demand. That's important to know about, but what others were talking about is the psychology of individual choices. There isn't any aggregation or composition happening there as far as I can see.
It's got to the point where in games like Skyrim I generally just sell the majority of potions and whatnot, because I know when it comes down to it I'll always be thinking there's a better time to use them.
For whatever reason, I still can't let go mid-combat, but when selling it I have no issue.
In a lifetime of regret and missed opportunities, this is one area at least where I've learned my lesson. I can now go through a whole game and use all the "special" stuff that I'm given, and usually even at the appropriate times! Kinda proud. It sounds silly but it took a lot of letting go.
Yeah: Imagine you had won the powerball lotto but decided not to cash it in. I can definitely "imagine" tossing it and burning it, etc. If it happened in real life, it would be very different.
It's a form of operationalization and that's a valid critique. My hope would be that there are some studies on the validity of the technique.
That being said, there are probably better ways. Additionally, I'd find it fun to riff on ways to make it better that have the same time/budget constraints as the original.
I'm inclined to agree with you, but I think the answer is actually more nuanced.
At a minimum, it seems like a cheap and quick way to try to validate an idea before dumping more time and money into it. If it seems like there is something there, then a longer and more expensive study is easier to justify. And when that study does happen, you'll already have data on what people expect to happen, which is nice.
It's not a perfect method for sifting through ideas, but at the end of the day you need some way of deciding what to pursue now, and without time travel there is no perfect method.
And because methods are included in publications, from a scientific standpoint there's nothing wrong with publishing it. Other researchers will read the paper and clearly see what was tested and be able to assign an appropriate amount of significance.
That significance may also vary through time. For example, somebody might conduct research about the accuracy of these imaginings in various ways. Maybe we find out we're actually quite good at them in some ways, and not so good in others. Maybe this is already done, and is why you've seen several recently? I don't know.
The problems mainly come from the popsci aspect. (Some) Writers and (some of) the general public see 'scientific paper' and more or less base their own assessment of significant entirely on that. You can also get a bit of the 'telephone game' going on, and the significance can get all jacked up. I don't know how to solve this problem, but I don't think the correct solution is to alter the scientific process itself.
> At a minimum, it seems like a cheap and quick way to try to validate an idea before dumping more time and money into it. If it seems like there is something there, then a longer and more expensive study is easier to justify. And when that study does happen, you'll already have data on what people expect to happen, which is nice.
That seems quite reasonable.
But lots of popular press coverage on "look what the scientists figured out" based on your initial validation justifying a longer and more expensive study... maybe less so.
Like, if this really is understood as that... should pop press be covering it at all? This article definitely is written with the tone "look what the scientists have learned about humans", not "they got enough validation to do more research", right? Hard to say what the researchers themselves think about the strength of what they found out, if they have your reasonableness about it.
But the researchers are clearly happy to give interviews and be quoted; I mean, it's a lot to expect a researcher to give up good press, sure.
> And because methods are included in publications, from a scientific standpoint there's nothing wrong with publishing it. Other researchers will read the paper and clearly see what was tested and be able to assign an appropriate amount of significance.
I wish I had your generosity of spirit about contemporary practice and motivations of academic science.
> I wish I had your generosity of spirit about contemporary practice and motivations of academic science.
Ha! I seem to have given you a false impression. There's... so many problems. However, I don't think most (any?) researchers are actually ignorant of those problems. They are actually doing research somehow, so they've figured out the game well enough.
I apparently managed to disguise this well when I said 'assign the appropriate amount of significance.' The unsaid implication here was that it may well be virtually none, particularly if it doesn't fit with a larger body of work.
In this article for example, it's paired up with another similar finding from an actual physical experiment, which lends it more weight.
> But the researchers are clearly happy to give interviews and be quoted
I always wonder about this. When I've seen longer form interviews with researchers, they almost always seem appropriately uncertain. At least they do sometimes, I might just not be knowledgeable enough about the field to know if/when they aren't.
Meanwhile, you don't get much of that in popsci work, beyond maybe a small disclaimer at the beginning.
I think Adam Ragusea (a cooking youtuber of all people, though admittedly he's an ex-journalism professor as well, and has also dealt with being the interviewee and having things go poorly) actually covered the intricacies of this really well in a video[0].
I've tried too many times to write "Long story short, ..." here, and have ended up with way too much meandering. Just watch the video. It's worth it, and does a great job of covering the perspective of all parties involved.
I think it's appropriate to assume good faith in individual instances - there are reasonable justifications for misrepresenting the certainty of some work in this kind of context. It's unlikely to cause any actual harm, and probably gets more people intellectually stimulated (which is probably good in both objective and subjective ways). In this article it even lets them make a concrete suggestion that has no cost to try and can probably help some people.
I don't know that there is an objective way to weigh those benefits against presenting a more accurate representation of the certainty of some information, when a misunderstanding of it is unlikely to ever harm anybody. I assign quite a bit of value to that accuracy inherently, but that's coming from the core of my value system. I can't present a logical argument for its correctness any more than I can for the exact degree to which I think human suffering is bad. As a result I think there's room here for reasonable disagreements about where to draw the line.
[0]https://www.youtube.com/watch?v=fxUnwsttr_8
(I think this is very illuminating if you're interested in this topic. I highly suggest finding the time for it, even if you understandably don't find the time to read the rest of this over-long comment.)
> However, I don't think most (any?) researchers are actually ignorant of those problems. They are actually doing research somehow, so they've figured out the game well enough
The problem is that "winning the game" is about publications, tenure, and (less important but increasingly also) popular recognition, rather than about the size of your contribution to valid research results (one hopes that publications and tenure correlate, and yet...).
So, getting results of dubious validity published, and covered in popular press, can be exactly "figuring out the game well enough", an end in itself, rather than a minor step toward a research program.
You are right that I don't want to assume bad faith in any individual instances, and yet... here we are, in aggregate.
What are the costs? I dunno, mis-education of the popular audience, for one, I guess. Maybe that doesn't matter, but, it kind of does? And the opportunity cost of all the scientific labor being spent on "easiest to publish the most times" instead of what might be highest priority to discover. And the aggregate collective effect of a scientific community enabling each other on that.
> The problem is that "winning the game" is about publications, tenure, and ...
Totally. 100% with you there. I didn't mean researchers aren't doing those things, but rather that papers built for those ends are probably not polluting 'science' as a whole. The body of work product is getting increasingly swampy, but the researchers are also the best positioned people in the world to sieve through it.
A large part of their job is to sort out what a paper actually tested and what it actually observed to begin with. These 'game playing' papers are quite blatant in comparison to the kinds of problems researchers already need to look for.
> What are the costs? I dunno, mis-education of the popular audience, for one, I guess. Maybe that doesn't matter, but, it kind of does?
Totally agree. I personally find it kind of abhorrent. I also can't honestly tell you why beyond "because" and am pretty certain most people don't feel as strongly about it as I do. I try to be mindful of that, but I think it mostly just gets used when I end up playing devil's advocate.
> And the opportunity cost of all the scientific labor being spent on "easiest to publish the most times" instead of what might be highest priority to discover.
Sure thing. It also increases time spent on literature searches for all that, and will for quite some time even if things changed tomorrow.
On the other hand, maybe it provides an easier on-ramp for teaching prospective researchers to be skeptical even if it's in a Paper, and even how to analyze them? At least it's nice to try to hope there's some benefit to all this.
> The problem is that "winning the game" is about publications, tenure, and (less important but increasingly also) popular recognition, rather than about the size of your contribution to valid research results (one hopes that publications and tenure correlate, and yet...).
> So, getting results of dubious validity published, and covered in popular press, can be exactly "figuring out the game well enough", an end in itself, rather than a minor step toward a research program.
You're describing Science by Press Conference/Press Release.
This is where gaming-based simulations (usually based on videogames) are apparently catching on as a very interesting realm for psychological experiments.
It's a game and a simulation, so the set-up cost is low (once the game itself exists), but participants actually seem to get highly invested in the game mechanics. (This might be a stretch for people on HN to believe, but stay with me ;-) And it's possible to run some very-large-scale studies by tapping into existing gaming titles and platforms, particularly MMOGs.
EVE Online is among the instances I'm aware of specifically:
I would guess that a lot of the validation for studies like this comes from marketing research. Marketing often does "study groups" to try to figure out if, say, one product name is better than another. Would you buy "ZimZam Fluid" or "Superlicious"? The study participants are not confronted with the actual product (it may not exist yet) but have to imagine which they prefer. If this approach shows success in marketing (which is all about peoples behavior) then it seems likely it is useful for figuring out other things about how people work and what they will think in specific situations. What people imagine is a clue to how they think, you could think of it as a simulation. People often imagine future outcomes as a way to make decisions in the real world.
What you are also saying is that the way research is funded determines how research is done. You need a reputation to get big money. To get a reputation you have to start small with the little money/time you have. Hence lots of studies are done that are too small, not well controlled, and are essentially useless because no one will believe they have conclusively proved anything. However, once you get 10 papers based on these poor studies you start to look like an expert and it becomes easier to attract bigger grants. By this time you may have actually formed some ideas about what it is you want to research as well, and abandon your original ideas as shots in the dark. So it's good in that it helps focus research and helps researchers gain experience, but it is bad in that it results in lots of studies that are not very useful to anyone (like all the medical studies you read about where, when the topic becomes important like in a pandemic, you will find scientists calling out studies as underpowered, too small, not an RCT, and useless).
It seems possible that making the basis of getting a PhD be making a unique discovery ("your contribution" as it is called) is misguided. It results in the above useless studies to gain reputation, plus it means researchers won't touch each others ideas for fear of being labeled derivative instead of unique. It may be much better to have all scientists in a field brainstorm ideas and contribute them to a common list and then individual grad students and researchers could choose one. The list could be ranked by the same crowd (and hopefully by outsiders as well) and used to assign grant amounts ahead of the selection of an actual researcher applying for the grant. That might result in better and more important topics being studied, an easier path to reputation for new scientists, and more useful studies being done that will actually advance science. Or maybe it would mean that a cabal of scientists and politicians take over the list and misdirect research for decades, which would not be much different from what we have now.
I buy last-year's special thing, and use it up. Phones, cars, bicycles, whatever. It was good enough last year, so it must be pretty good. And I treat it like a tool. Saves a lot of money too.
I long ago categorized that 'too good to use' impulse as a childish one. As a kid I wouldn't want to use up a new box of crayons or whatever, and would end up never using them much at all. Later I recognized this as a feeling that I wasn't good enough for nice things. I consciously shed that attitude, and taught my children to be confident enough to open everything and use it up.
My grandmother had an oriental rug that no one was allowed to step on. For 40 years she tried to keep this rug pristine because she was going to sell it one day. Anytime any of us grandchildren or even the eventual great grandchildren stepped on it, they would get a harsh scolding.
She died never having sold that rug, but after her death there was a lot of fighting over who would get the thing. It tore our family apart, the arguing over it.
Finally the victor took it to an auction house to get it appraised and sold. It was worthless. They put it up for auction and no one wanted it. Eventually the auction house asked us if we wanted them to dispose of it for a fee.
I think about this a lot when I consider letting go of material possessions. That rug has become sort of a symbol to me, of the encumbering nature of our possessions. Or maybe it was just a conduit for my grandmother’s undiagnosed mental health issues. Either way I try to live with fewer things in my life these days.
Specialness fascinates me. And the pointless rituals that go along with it. They have a strong effect on us, which in some sense makes them not pointless. Like placebo: inert in itself, but effective because we make it so.
It's a good story, and if not for the arguments over who would get it, I could see it having been a nice anchor to remember eccentric old grandma.
Watching shows like Pawn Stars, or American Pickers, or Car SOS, you see many ways that these types of thought processes can play out.
So many people go to the grave never getting around to their hopes and dreams. Often the families of these people have no idea what to do with their possessions, and it's interesting how often the most treasured item ends up being the least appreciated asset of the estate.
I always attributed it to having a good memory, overloading objects with meaningful stories, and then being unwilling to let them go for fear of losing the memory trigger.
Similarly, my fridge is full of nearly empty jars and bottles of interesting sauces and condiments. If I chuck it, how will I remember to buy that one again?
I used to do this a lot as a kid. I still do it, but I used to, too.
I got a can of Barq's soda as a kid, during a road trip to another city. It wasn't sold in my city so I saved it for something special. It must have sat in my room for years before I realized how ridiculous it was and threw it away. Part of it was that by that time you could easily buy Barq's soda in my city.
> Similarly, my fridge is full of nearly empty jars and bottles of interesting sauces and condiments. If I chuck it, how will I remember to buy that one again?
A photo album on your phone might help...you could even mark up each photo with text to remind you what you liked about each thing.
Absolutely great idea. I do it for hot sauces and beverages. Also recipes and grilling times and stuff too, just take a pic or screen grab, & throw it in the album.
Is it just me or does this seem like some flimsy research?
I mean asking people to choose between two types of paper and observing that people who choose type A are more likely to choose type A the next time is hardly conclusive evidence of anything. It'd be weirder if it didn't happen.
And if you ask people for reasons why they shouldn't open a bottle of wine just yet they're obviously going to suggest that they're waiting for a better occasion.
For something that doesn't even seem like much of a problem, or at least hasn't been established to be a problem by anything other than the author's gut feeling, this all doesn't seem to amount to much more than just affirming people's preconceptions.
Quite! The article doesn't start off on a great note:
> Fast forward to today. I have never worn my Target blouse. What had started out as ordinary now holds a special place in my closet, and no occasion feels quite worthy of my wearing it.
> What happened here? Why do people own so many unused possessions, treating them as though they are too special to use?
One person bought a blouse they don't wear and then extrapolates that people own so many unused possessions? Where is the data that shows how many people have unused possessions, how many of their possessions are unused, and how many of those are unused due to their specialness? The article does build a decent case that once something is assigned a special status, that status becomes amplified in an interesting way that should be investigated, but the article isn't itself a good investigation of that effect.
> For something that doesn't even seem like much of a problem
Presumably the author's occupation, "marketing science", is interested in getting people to use things (and use them up) so that they'll need to buy more of them - so that a marketer can then sell them those things.
I remember as a kid my parents brought me a nice rubber ducky. I was very upset when they told me it was marzipan and I was supposed to eat it, after which it would no longer exist. It may have also been a marshmallow peep.
At some point I had a moment of realization that most things are meant to be used. That is, those fancy decorative soaps smell better on your hands than on the shelf, chocolates taste better in your mouth than in the box, and if someone gives you a gift they want you to enjoy it. Unsurprisingly I agree with the article's advice: use the things you have as soon as it's appropriate to use them. Light candles for dinner, write in the margins of your books, eat off china with silverware. This makes ordinary moments more enjoyable and more special.
Exactly, things have a purpose and more often than not, that purpose is to be used up, possibly repaired and then used again, until it finally wears out or breaks completely.
I tend to baby new things until the feeling of novelty wears off, then I just use them. I don't abuse them, but I stop trying to avoid normal wear and tear. Repairing things is also something I find relaxing and interesting.
As a consequence, I tend to stick to things that are durable and can be repaired, designs and materials that have stood the test of time, and I keep them for as long as I can.
A well-worn patina on an obviously beloved object is hard to fake.
I'm reminded of that one special potion that sits and sits and sits in your inventory, unused, because once it's used it will be gone and where will you find another one? Megalixirs (restore all HP and MP to all party members) and Hero Drinks (the same, plus temporary invincibility to all) are likely candidates in Final Fantasy games, but in Final Fantasy XII Ethers (restore some MP to one character) are incredibly rare and MP recharges slowly over time anyways so why use them? You might need them later...
>Interestingly, participants who had the initial opportunity to use the notebook, but hadn't, were significantly less likely to use the notebook later in the session, versus those who hadn't had the option. And this finding was not limited just to notebooks. We saw the same pattern in other scenario-based experiments using bottles of wine and TV episodes.
eeeehhhhhhhh.... I dunno. "People who don't do a thing once are less likely to do it the next time" seems like it would explain this as well or better, and it's a simpler explanation.
And I don't buy that the imaginary-wine followup question implies "more special" either. By telling them that they did not open it immediately, they've set up that initial "did not use" - there are many reasons to not-use something, many of which work in future unspecified scenarios as well. That doesn't imply it is growing less likely to be used with each event, maintenance is sufficient.
---
I absolutely agree that some things for some people fall into this kind of spiral. They also get out of it sometimes - I've personally had or seen quite a few "argh, why do we even have this?! [use it or lose it]" sequences.
But I think more fall into knickknack-like territory - they were never going to be used, adding more time doesn't change that. It's closer to an emotional attachment to something, and not wanting to lose it. There's no ideal future where it will be used.
One way out of this is if the special item somehow becomes worthless over time. A food item that spoils, clothing that goes so hopelessly out of style that you can never wear it, gadgets that are no longer compatible with anything.
Another is if you can convince yourself that it'll be just as special to some other person, then pass it on to them. But in that case, don't ask about it later, in case you were wrong.
This is the solution. It’s important for people to realize this.
Once people realize that those special items are going to waste, then hopefully a positive change can happen, and those items can be used instead of collecting dust and going to waste.
I have noticed the opposite phenomenon: at some point I lose my reverence for new things and start treating them poorly. Usually electronics like a new phone, new laptop, new headset. When they're new, I feel their value in their weight and physical characteristics. Once I get used to that, they start feeling commonplace and I treat them carelessly, and don't even really feel bad about it.
I have some external hard drives, some internal SSDs, and some USB thumbdrives, brand new in the package. Haven't found anything to store on them yet. I keep re-using old ones. Eventually I'll find one a few years later, tiny relative to current sizes, and laugh at how I never used it and originally paid $250 for it.
I have things that are special to me, but nothing that is too special to never be used.
In the past I think I've had similar feelings to what you describe: I buy a lot of stuff, but it's just stuff. These days if I'm going to buy something, I get what I think is the best one and if I can't afford that, I wait until I can. I probably over spend on some stuff, but the wait also leads to me buying fewer things.
I'm still not sentimental about physical things though. I have no qualms about giving away or selling things that were given to me as gifts or bequests.