Hah so true. Absolute idiocy.
Being artificially tied to the internet is basically the easiest way to "damage" a consumer. So naturally that is why 99% of commercial software does its best to tie a consumer to the internet.
It is getting very close to Stockholm Syndrome for consumers. The more they are treated like absolute dirt by these (effectively criminal) corporations, the more they respect, revere and almost worship them.
Either way, Apple not displaying a copyright correctly is the least of our concerns. What is a little bit sad is that if the FreeBSD Foundation did try to fight them, they possibly would not win, even if Apple had actually broken a license agreement.
Sorry, no. Think about the alternatives.
Let's consider a typical consumer, who has one or more devices (laptop, smartphone, tablet, for this discussion they are all the same). They want to do something on their device, and doing so will create some data that they want to be persisted (meaning they can read it again later).
Observation: Most computer users are nearly continuously connected today, but with varying bandwidth and latency. That's simply a verifiable fact.

For a fun story, earlier this year I spent 3 weeks in rural southern Brazil (family matter), and that's not a super-rich high-tech area, quite the contrary. My cell phone was nearly always connected, even on the 1-hour trip on a provincial road between the district capital and the small farming village where my mother lives. Navigation and messaging worked nearly always (because of the nature of the family emergency, I was in constant contact with my siblings, who live on different continents). Many of the "big buildings" (hotel, restaurant, government hospital, court house, ...) had free WiFi that simply worked. Boring. Ironically, where I live (in the mountains on the edge of Silicon Valley), connectivity is slightly worse than in rural Brazil; there are several stretches of road through uninhabited areas on my daily commute where my cell phone is disconnected (no phone calls) for several minutes.

And this anecdotal data agrees with what the computer industry knows: connectivity is nearly always available.
Now let's consider the situation of our consumer. In a nutshell, they want to "save a file". Let's say we are in charge of the software to do that. Where shall we save it? On the device they're using right now? That is the traditional choice, but it is a bad idea for a variety of reasons. First, the consumer expects that data is seamlessly available on all devices; they expect to open the file they just saved on their laptop using their phone, a minute from now and a year from now. So it needs to go "in the cloud", like it or not. Second, the storage on the device is by its nature unreliable: drop the cellphone in the swimming pool (been there, done that), and the file is gone, which will get the consumer very upset. Not at themselves for dropping the phone in the pool (people tend to not take responsibility for their actions), but at the provider for making a system that broke due to a small mistake. Third, any single storage device is too unreliable for our expectations of data survival today. Disks have expected error rates, and with the size and error rate of disks today, data loss is no longer a remote possibility, but a near certainty. There is the (very accurate) observation from the CTO of NetApp that storing data with fault tolerance to only 1 disk error is today "professional negligence". We need to make more than 2 copies of the data, and they need to be spread across multiple failure domains (not all in the same physical location, not all using the same power grid).
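To make the replication argument concrete, here is a toy back-of-the-envelope calculation. The 5% annualized failure rate is an illustrative number I picked, not a measured one, and the independence assumption is exactly what fails when all copies sit in one failure domain, which is why spreading them matters:

```python
# Toy durability estimate: probability that ALL replicas of a file are
# lost within one year, assuming independent failures per copy.
# afr = 0.05 is an assumed (illustrative) annualized failure rate.

def p_data_loss(annual_failure_rate: float, replicas: int) -> float:
    """Probability that every one of `replicas` independent copies fails."""
    return annual_failure_rate ** replicas

afr = 0.05
for n in (1, 2, 3):
    print(f"{n} copies: P(all lost) = {p_data_loss(afr, n)}")
```

With one copy you lose data 1 year in 20; with three independent copies the naive estimate drops to about one in 8000 years. The catch is the word "independent": replicas in the same building share fires, floods, and power grids, so the real win only comes from separate failure domains.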
Now take that consumer requirement, and put it together with the observation above, and the solution jumps out: rely on the network connection for data safety. Really, there is no other sane option today. You may not like it; if you are a technology luddite who wants to remain in the 80s and 90s, or an anarchist who doesn't want to leave a trail of evidence online, you can reject the solution. But for most consumers, the right solution is to put all data in the cloud.
That leaves one little problem: What to do for the rare cases where connectivity is missing or insufficient? And this is where software development gets really hard. You need to start leaving cached copies of the data in various places where it is likely to be used. The moment you do that, you have a consistency problem that's somewhere between hard and impossible to solve. Go back and read Satya's Coda paper: The solution he outlines (eventual consistency with manual conflict resolution) is an attempt to do this, but an attempt that today would get laughed out of town. So today a lot of effort goes into consistent caching, and resolving conflicting updates.
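To see why the caching problem is genuinely hard and not just an implementation detail, here is a minimal sketch of version-vector comparison, the classic mechanism behind "eventual consistency with manual conflict resolution". The device names and the compare() helper are invented for illustration, not from any particular system:

```python
# Each replica keeps a version vector: a map from device name to the
# count of updates that device has applied to the file.
# If each side has seen an update the other hasn't, the edits were
# concurrent and no automatic rule can pick a winner.

def compare(vv_a: dict, vv_b: dict) -> str:
    """Return 'equal', 'a_newer', 'b_newer', or 'conflict'."""
    devices = set(vv_a) | set(vv_b)
    a_ahead = any(vv_a.get(d, 0) > vv_b.get(d, 0) for d in devices)
    b_ahead = any(vv_b.get(d, 0) > vv_a.get(d, 0) for d in devices)
    if a_ahead and b_ahead:
        return "conflict"   # concurrent edits: needs manual resolution
    if a_ahead:
        return "a_newer"
    if b_ahead:
        return "b_newer"
    return "equal"

# Laptop and phone both edited the file while offline:
laptop = {"laptop": 3, "phone": 1}
phone  = {"laptop": 2, "phone": 2}
print(compare(laptop, phone))  # -> conflict
```

The "conflict" case is the whole problem: the system can detect that two disconnected edits diverged, but it cannot decide which one the user meant to keep, which is exactly where Coda punted to manual resolution and where modern systems spend all that engineering effort.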
So what does this all say? Relying on the "internet" being on is nearly always the right answer. If you don't like it, feel free to unplug your ethernet cable. But don't expect providers of software or systems to cater to your whims; they are trying to serve the bulk of the users well.