Tons of companies were hit by the prep costs. But all in all, if you take an outdated computer specialized for your business that cost you $10,000 about 15 years earlier, then replace it with a $1,000 machine of the day, you create an opportunity to improve everything. By asking if anyone was "hit", you want to know if some company went under either from working on the date bug or from ignoring it. That's unanswerable, because it assumes a company existed in a vacuum where it could only buy new machinery, hardware, or software from some magical vendor that ignored Y2K, or was so negligent to begin with that it had no upgrade path at all.
The TL;DR answer
Think of a business in the worst situation as far as legacy computing goes. AT&T, maybe? They had infrastructure that was being updated regardless. Smaller banks were still required to be in compliance. So really, maybe a few ATM manufacturers couldn't secure the software licenses at the price they needed to compete? They couldn't honor service agreements unless they did something. The same went for anything that needed to spit out a receipt for law/reg compliance or other reasons.
I'll tell you what, though: a LOT of businesses still running on IBM System/3x machines and OS/2 Warp servers disappeared within 18 months. It wasn't the servers or the OS that made the two-digit date issue a problem, but the many layers of specially made software on top that forced a fresh-start cycle. That DOES cost a lot of money. When you plan your upgrade cycles and suddenly everyone is inflating the market for new hardware and software licenses... it's havoc.
The bigger issue, IMO: we used to have a steady flow of programmers pop up every year who contributed to and progressed development for all of society. When you have a panic like Y2K, programmers that normally made $50k a year were suddenly making $300k a year (in all fairness, there were multiple demand shifts at this time, but Y2K was the single biggest). That's a fair bit more than people in medical specialties with equally technical jobs were making. We also had a fair consensus that C/ASM were the best things for any CS student to learn, because the performance and resource usage are simply hard to beat, and the skills learned in such languages translate.
Therefore my opinion is: businesses created a short-term bubble that swayed a lot of bright kids away from other fields and gave them the impression that programming is the way to go for money (a field that also suffers a demand drop when you hit 40ish).
Because you needed more lower-skilled coders at the time to reproduce software solutions, the world developed a demand for power- and resource-hungry, sloppily coded software. Crap code just to get things moving, but we traded a lot away to do that. Java was for proof of concept, filling in the gaps, and maybe some appliances. Even kids into programming knew this before Y2K and only learned Java for edge cases. People go where the money is, and at that time non-tech-friendly executives were sold on the idea that cheap code is the only way to prevent architectural shifts, because IP is expensive and hardware is cheap. I don't believe we have ever recovered from that line of thought, and we are worse off for it, by a few orders of magnitude.
Cheap code = cheap fixes = still broken.
By the way, hardware and software are equal to each other in terms of development need in the world. Hardware is always going to outperform a software solution. The only difference is that you can know software while knowing very little about hardware, but it's close to impossible to know hardware without knowing software. Like peeing without pooping, versus trying to poop without peeing.
A lot was done to fix Y2K, so people don't think it was anything. A lot of hubbub. But the truth is, there are still issues being uncovered. It's like knowing the dam has a crack in it. Well, if you fix the damn dam before it breaks, nobody remembers a damn thing. Ignore the dam, and damn, that sucks.