I agree, modern definitions exclude 1 since "we lose" unique factorization. It's interesting to note [1] that this viewpoint solidified only in the last century.
No, 1 is excluded for reasons closely related to, but not conceptually identical with, the one you mention.
The "intuitive" argument that 1 is prime is that, as with prime numbers, you can't produce it by multiplying some other numbers. That's true!
But where a prime is the product of exactly one factor, 1 is the product of zero factors (the empty product; see the sketch below), which is a very different status. The argument over whether 1 should be called a "prime number" is almost exactly analogous to the argument over whether 0 should be called a positive integer.†
It's more broadly analogous to the argument over whether 0 should be called a "number", but that argument was resolved differently: "number" was redefined to include negatives, making 0 a more natural inclusion. If you similarly redefine "prime number" to include non-integral fractions (how?), it might make more sense to count 1 among the primes.
† Note that there is no Fundamental Theorem of Addition stating that the decomposition of a number into addends is unique. It isn't, but 0 is the empty sum anyway.
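To put the "zero factors" point and the footnote in symbols, here is a small sketch in amsmath-style notation (the particular numbers 7 and 4 are just arbitrary examples of mine):

\begin{align*}
7 &= \prod_{x \in \{7\}} x && \text{(a prime: the product of exactly one factor)} \\
1 &= \prod_{x \in \emptyset} x && \text{(the empty product)} \\
0 &= \sum_{x \in \emptyset} x && \text{(the empty sum)} \\
4 &= 1 + 3 = 2 + 2 = 1 + 1 + 2 && \text{(decomposing a number into addends is not unique)}
\end{align*}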
3 is also the product of the multisets {3, 1}, {3, 1, 1}, etc.
We're excluding the unit (i.e., the multiplicative identity) when defining these factor multisets because including it destroys unique factorization (sketched below).
That 1 is the unit is also why it's the value of the product over the empty set: we want the product over a disjoint union of sets to equal the product of the products over each set. But we don't exclude it from the primes for that reason.
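A quick sketch of both points (the number 6 and the sets A, B are arbitrary choices of mine): admitting 1 as a factor destroys uniqueness, while the empty-product convention is exactly what lets the product over a disjoint union split into a product of products:

\begin{gather*}
6 = 2 \cdot 3 = 1 \cdot 2 \cdot 3 = 1 \cdot 1 \cdot 2 \cdot 3 \quad \text{(no longer a unique factorization)} \\
\prod_{x \in A \cup B} x = \Bigl(\prod_{x \in A} x\Bigr)\Bigl(\prod_{x \in B} x\Bigr) \quad \text{for disjoint } A, B, \qquad \text{so taking } B = \emptyset \text{ forces } \prod_{x \in \emptyset} x = 1.
\end{gather*}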