how to secure a hosted server

It really comes down to a have-your-cake-and-eat-it situation :D Why would you consider a hoster? Because it will save you a lot of money on things like those mentioned above. And sure, there is a price: trust.
In my own practical case, which brought me to these considerations, the reason is that a dedicated server from a hoster is cheaper than a fixed IP address from the telco.
Consequently, that server then stands somewhere. Next point: obviously any disk has to be encrypted, because at some point it will break and then no longer be accessible, and, unless we can physically destroy it, we just do not know where it may end up.
Then, to enter the password for the disk, there are two options: A) store the encryption key on the disk itself, or B) enter it on the console. A) is blatantly bogus, as there is only one disk. And B) doesn't work because there is no console.

So, at that point I created a new solution that geli-encrypts the disk while doing neither A) nor B). (The solution is to store the geli key within the geli-encrypted disk.)

And now I am researching what the professional, best-practice, industry-accepted solution for the issue would look like. But, as it seems, there is none.
 
To secure stored data, we use our own distributed data system. Stealing one physical server won't let the thief restore the original data. I wish it were open source, but it is not. If a client that is permitted to access all servers is stolen, though, then the data are stolen too.
 
And that's the problem. Usually we do not even have an idea who that hoster might be: you rent a server from some provider, then there are subcontractors who run the data centers, there are other subcontractors who run the support staff (at changing locations around the world), and all of this is constantly moved to the lowest bidder.
No. Using a concrete example: You rent CPU capacity or a virtual FreeBSD machine from Amazon (I'm just using them as an example because they're the world's largest hosting/cloud provider, not because I know anything specific about their internals, nor as an endorsement or recommendation). Your contract is with Amazon, not with a subcontractor. Amazon is responsible for making sure only trustworthy people enter the data center. You don't have a contract with the electrical subcontractor, nor the security contractor, nor even the local fire department (both Amazon and Google have had fires in cloud data centers; I don't know about Microsoft). If someone gets in, that's Amazon's responsibility, and from that viewpoint, the electrical subcontractor will be background-checked just as carefully as Amazon employees are.

I've worked for several of the largest computer companies = cloud providers in the world. I've never been inside a data center that contains customer data, and I don't expect I ever will be, because my job doesn't require me to, and having me inside is an unnecessary security risk. If I walked up to a data center with my employee badge, they would not let me in (and would call my manager).

Clearly, you have no idea how actual hosting/cloud companies are run.

There might be very tight controls about who is allowed to enter a server room, but there are no controls whatsoever about who might just buy one of those companies.
In the (extremely unlikely) case that some other company buys Amazon, you will have ample warning, and you will have time to remove your data if you think the future owner won't be trustworthy. If you are not paying attention to financial transactions involving your suppliers, that's your sloppiness.

Next point: obviously any disk has to be encrypted, because at some point it will break and then no longer be accessible, and, unless we can physically destroy it, we just do not know where it may end up.
Firstly, in real-world data centers of virtualized machines (where the physical disk is under the control of the hosting company), the disks are usually hardware-encrypted, using SEDs (self-encrypting drives). You typically don't even see that, because the hosting company typically provisions your virtual servers with virtual disks: while those may look like SATA or SCSI disks, they are in reality RAIDed arrays of disk drives, typically encrypted, typically including things like snapshots and remote copies.

Second, data center operators tend to not let any hardware out. Ever. Typically, a large data center will have a shredder, and broken or obsolete hardware (not just disks, but whole computers) is run through the shredder and turned into small flakes of metal and plastic. Sometimes, individual data centers are too small to have their own shredders (those are big and expensive machines), in which case dead hardware is collected on site in containers, then shipped under seal to a centralized shredding facility.

This actually leads to some real-world problems with hardware returns. Say that model X of disk drive from manufacturer XYZ (typ. examples: Seagate, WD) has been failing with unusually high frequency at a customer site, like hosting provider ABC (typ. example Amazon). Seagate's/WD's testing lab wants to get one of those disks back, to dismantle it and see what went wrong (been there, done that, got the tunneling microscope pictures of the platters back from one of those vendors). But the hosting provider will never let any disk out. What do we do now? We get XYZ's and ABC's engineering teams and lawyers into one room, we write down a precise contract that describes trust and security, we segregate the failed disks and mark them properly, we ship them under seal from ABC to XYZ, where only trusted employees take them apart, put the platters under a microscope, and then send the pictures to me with their analysis. Real-world hardware vendors and cloud companies put a lot of effort into data safety.

Then, to enter the password for the disk, there are two options: A) store the encryption key on the disk itself, or B) enter it on the console. A) is blatantly bogus, as there is only one disk. And B) doesn't work because there is no console.

So, at that point I created a new solution that geli-encrypts the disk while doing neither A) nor B). (The solution is to store the geli key within the geli-encrypted disk.)
No. The solution is to store the encryption key in a secure key server. By its nature, that key server has to be even more secure than the systems that use its keys. There is a whole industry of disk encryption, key distribution, and key storage. I know whole companies that do nothing but this, and I know several people who earn their daily living (and have for 20 years) doing nothing but storage encryption key distribution.
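To make that concrete: a minimal sketch of what fetching a key from a key server at boot could look like with geli on FreeBSD. The key-server URL, certificate path, and device name here are invented, and real deployments would use a dedicated key-management product or protocol (e.g. KMIP) rather than a bare HTTPS endpoint; treat this as an illustration of the flow, not a recipe.

```sh
#!/bin/sh
# Hypothetical sketch: pull the geli keyfile from an internal key server
# over TLS at boot, attach the provider, then scrub the local key copy.
# Assumes the provider was initialized without a passphrase component:
#   geli init -P -K /path/to/da1.key /dev/da1

# libfetch honors SSL_CLIENT_CERT_FILE for client-certificate auth.
export SSL_CLIENT_CERT_FILE=/etc/ssl/host-client.pem

fetch -o /tmp/da1.key "https://keys.internal.example/v1/keys/web01-da1" || exit 1

geli attach -p -k /tmp/da1.key /dev/da1   # -p: keyfile only, no passphrase
rm -P /tmp/da1.key                        # overwrite, then unlink
mount /dev/da1.eli /data
```

Note that this only shifts the trust question, as discussed below: now the key server and the network path to it have to be at least as secure as the disk they unlock.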

And now I am researching what the professional, best-practice, industry-accepted solution for the issue would look like. But, as it seems, there is none.
Sorry, there are lots of best practices, there are standards, and this is all commonly done. It is not talked about much, and many details are not made public, out of fear of giving too much information to attackers. If you want secure storage, feel free to go to EMC, IBM, HP, Oracle, Hitachi, Google, Amazon or Microsoft, and contract with them. If you insist on thinking that you are smarter than the thousands of people who have designed these systems, and on doing it yourself, you will reinvent the wheel, but this time you will get it triangular. I'm sorry to be harsh, but your paranoia and unreasonable dislike of business isn't going to make your data any more secure; on the contrary.
 
[...] So, at that point I created a new solution that geli-encrypts the disk while doing neither A) nor B). (The solution is to store the geli key within the geli-encrypted disk.) And now I am researching what the professional, best-practice, industry-accepted solution for the issue would look like. But, as it seems, there is none.
Did you have a look into gbde(4) & gbde(8)? Maybe somewhat related to this topic: Solaris has roles in its RBAC implementation; i.e. the root user is a role (there's no root login anymore) that can be taken by all accounts of the wheel group. All of root's activity is logged, and thus it's always clear who did what; this log is append-only, i.e. an admin gone wild cannot hide what s/he's doing. You can even configure a 2-out-of-3 quorum for certain delicate tasks. I experimented with that over a decade ago; it worked fine. I wish we had that on FreeBSD, too.
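For reference, the Solaris-side setup is roughly this, sketched from memory; the exact syntax varies by release, so treat it as illustrative only:

```sh
# Solaris 11-style sketch: turn root into a role (no direct root login)
# and grant that role to a named admin account ("alice" is made up).
usermod -K type=role root     # root becomes a role, not a login account
usermod -R +root alice        # alice may now assume the root role via su
```

After that, root's actions run under alice's audited identity, which is what makes the "who did what" log meaningful.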
 
No. Using a concrete example: You rent CPU capacity or a virtual FreeBSD machine from Amazon (I'm just using them as an example because they're the world's largest hosting/cloud provider, not because I know anything specific about their internals, nor as an endorsement or recommendation). Your contract is with Amazon, not with a subcontractor. Amazon is responsible for making sure only trustworthy people enter the data center. You don't have a contract with the electrical subcontractor, nor the security contractor, nor even the local fire department (both Amazon and Google have had fires in cloud data centers; I don't know about Microsoft). If someone gets in, that's Amazon's responsibility, and from that viewpoint, the electrical subcontractor will be background-checked just as carefully as Amazon employees are.
This is all true, but it doesn't help in any way. I did work inside the business, and it was the serious part of the business - banking/healthcare/insurance, big corps - and even there things were not perfect. From the impression I get on lowendtalk, the usual hosting business is yet a different beast.

Given Your case, and given that something bad happens, what could I do? Sue Amazon? That doesn't help, and anyway they have more money for lawyers.
I've been through exactly that subcontractor issue in a different matter; look up the "Priceless Specials" Mastercard scandal. So now my birthdate de facto *IS* for sale on the darknet, and all the stupid babble from Mastercard cannot change that anymore.

I've worked for several of the largest computer companies = cloud providers in the world. I've never been inside a data center that contains customer data, and I don't expect I ever will be, because my job doesn't require me to, and having me inside is an unnecessary security risk. If I walked up to a data center with my employee badge, they would not let me in (and would call my manager).
I did work as a consultant for banks, and I could have had access to *everything*: the machines, the security, and the machines monitoring the security (not fully legal, obviously, but also not easily detectable if it were done properly).
(Indeed, we were among those people who are not allowed to share the same airplane, and on 9/11 I got my prepare-for-evacuation notice six hours *before* the event.)

So yes, there are all those security precautions You describe below, and the managers will be happy with them. But then there are the illuminati, and the rules do not apply to them. And I know that because I once happened to be among the illuminati.

Clearly, you have no idea how actual hosting/cloud companies are run.
I just need to read lowendtalk to get some idea. But all right then, tell me who You think runs this one:
[attached screenshot: temp.png]


No. The solution is to store the encryption key in a secure key server. By its nature, that key server has to be even more secure than the systems that use its keys.
So this would be the industry-approved, best-practice solution, and it doesn't solve the issue; it just moves it on to another level.
Sorry, there are lots of best practices, there are standards, and this is all commonly done. It is not talked about much, and many details are not made public, out of fear of giving too much information to attackers. If you want secure storage, feel free to go to EMC, IBM, HP, Oracle, Hitachi, Google, Amazon or Microsoft, and contract with them. If you insist on thinking that you are smarter than the thousands of people who have designed these systems, and on doing it yourself, you will reinvent the wheel, but this time you will get it triangular. I'm sorry to be harsh, but your paranoia and unreasonable dislike of business isn't going to make your data any more secure; on the contrary.
No, I'm perfectly happy with the outcome that there is no practical and straightforward solution, without additional cost, to protect some server rented ad hoc from whatever shop somewhere on the internet.

I do not intend to reinvent the wheel, nor to sell or advertise any solution whatsoever - I just do what appears to be necessary.
 
There's this German "saying" :D , I'll leave the exact translation to anyone interested …

| Das kannste schon so machen …
| aber dann isses halt kacke.
| (roughly: "Sure, you can do it that way … but then it's just crap.")

I think IF you have high security needs for a server and decide to fulfill them yourself, there's a high risk this saying will apply to your solution. But hey, you have been warned; in the end, it's your decision ;)
 
There's this German "saying" :D , I'll leave the exact translation to anyone interested …

| Das kannste schon so machen …
| aber dann isses halt kacke.
| (roughly: "Sure, you can do it that way … but then it's just crap.")

I think IF you have high security needs for a server and decide to fulfill them yourself, there's a high risk this saying will apply to your solution. But hey, you have been warned; in the end, it's your decision ;)
I don't consider this helpful.
If there were a generally accepted path to take, then we could argue whether DIY would make any sense in that light. In this case there is none (buying support from EMC/IBM/etc. is not an acceptable path; what's more, I happened to work for/with quite a few of these companies and know their products).
 
There IS a generally accepted path, and that is: trust the company you'll do business with. Or, of course, put the other way around: do business with a company you trust. Sure, if you find none, you'll have a problem. But you should face reality: doing it yourself will be much worse or much more costly.
 
There IS a generally accepted path, and that is: trust the company you'll do business with. Or, of course, put the other way around: do business with a company you trust. Sure, if you find none, you'll have a problem. But you should face reality: doing it yourself will be much worse or much more costly.
Sorry if I fail to understand, but trusting the hoster and NOT encrypting my disk appears to me in no way less costly or less bad than just encrypting my disk.
 
[...] Your contract is with Amazon, not with a subcontractor. Amazon is responsible for making sure only trustworthy people enter the data center. You don't have a contract with the electrical subcontractor, nor the security contractor, nor even the local fire department [...]. If someone gets in, that's Amazon's responsibility, and from that viewpoint, the electrical subcontractor will be background-checked just as carefully as Amazon employees are. [underlining by me]
Contra. No, usually not (not necessarily). The whole story about all this sub-sub-sub-contracting is, as you wrote, to ensure responsibility (in theory). But in reality, it's about how to get rid of responsibility ;) while retaining a grip on the profit... To rely on the theoretical chain of responsibility - i.e. that the sub-sub-sub-contractor X will perform a background check on the aide of the electrician who repairs the power line to your server's rack - is not realistic, and reckless. Yes, when something goes bad, you can sue your contractor, who will sue his contractor, who will sue... etc., but your precious data will be gone or compromised or whatever damage you might have suffered. I remember that a big IT consulting company was renamed after one of their clients went bankrupt after an IT system "upgrade". Their new name sounds very accelerating.
No. The solution is to store the encryption key in a secure key server. By its nature, that key server has to be even more secure than the systems that use its keys. There is a whole industry of disk encryption, key distribution, and key storage. I know whole companies that do nothing but this, and I know several people who earn their daily living (and have for 20 years) doing nothing but storage encryption key distribution.
Then Kerberos comes to mind as a well-established & mature, proven solution. Maybe it could be "misused" for this topic.
Zirias: Redewendung = idiom & Sprichwort = proverb
 
Zirias: Redewendung = idiom & Sprichwort = proverb
And I consider it neither of them, but, well, that's really off-topic now ;)

Yes, when something goes bad, you can sue your contractor, who will sue his contractor, who will sue... etc., but your precious data will be gone or compromised or whatever damage you might have suffered.
Well, but exactly this scenario is why anyone involved WILL try to avoid that.

You always have to compare. Your single server in a normal building is exposed to many more risks. For my private stuff, this is fine: I have my own server; a RAID with encrypted disks and somewhat regular backups will do. For a business? You want redundancy, physical security, and so on. Doing everything that's necessary here yourself will be extremely expensive.
 
Security-wise it boils down to: (own hardware manufacturer + data center) > (own data center) > (hired rack in a data center) > (bare-metal server at a hoster) > (VM at a hoster) > (container at a hoster) > (shared hosting/webspace/accounts)
 
Did you have a look into gbde(4) & gbde(8)?
I did, but that was back in 2009 - and then it did not look very promising, at least for the things I wanted to do back then.
From what I get out of this thread, geli seems to be the more favored solution.
Then Kerberos comes to mind as a well-established & mature, proven solution. Maybe it could be "misused" for this topic.
Do You have Kerberos employed somewhere?
I am using it; it is a bit of a difficult beast. It is best suited where you have a homogeneous landscape of machines that are all equally secure. When you start to differentiate intranet/perimeter/etc., it gets rather complicated.
 
Security mainly boils down to trust; but from what has been unveiled of human nature, the latter is at least a complex topic, to put it politely. A psychologist once told me the most convincing definition: in the end, trust is a decision you make. It has been shown several times that open-source systems are inherently safer than closed-source systems. So why should someone rely on e.g. Dell/EMC or Big Blue or whoever else sells closed-source systems? Nowadays even open-source hardware exists. Hopefully this laptop I'm using right now is the last closed-source machine I own. I do not even have the ME's password; can you believe that? That's very frustrating, and coreboot or the like is not certified for this machine.
 
Do You have Kerberos employed somewhere? I am using it; it is a bit of a difficult beast. [...] When you start to differentiate intranet/perimeter/etc., it gets rather complicated.
Maybe that's the raison d'être of the key-handling companies that ralphbsz mentioned above? I.e. their service is essentially to hide the complexity of Kerberos? Which is perfectly justified, since it saves their clients a good deal of headaches.
 
As for the "how do I enter the encryption key when my (remote) server boots and wants to un-encrypt the disks?" question: you configure your server so you have ssh access into the environment where the key needs to be entered, you ssh in, enter the key and the server continues booting.
No, unfortunately, I don't know how to set this up with FreeBSD.
 
As for the "how do I enter the encryption key when my (remote) server boots and wants to un-encrypt the disks?" question: you configure your server so you have ssh access into the environment where the key needs to be entered, you ssh in, enter the key and the server continues booting.
No, unfortunately, I don't know how to set this up with FreeBSD.
That's fancy.
When the server boots, that means - for full disk encryption - in the loader; so one would have to add full network capabilities and full sshd support to the loader code. No way.
The other thing is IPMI. But I have no idea what it can do; I have never accessed it. (I never bothered to access the console of a hosted server - I don't need a console to set up FreeBSD.) I might assume the IPMI can give access to keyboard+screen - but then it is doubtful how that would work with ssh.

The other option is NOT to do full disk encryption. The system can then boot from an unencrypted OS installation, bring up the network and sshd, and then receive the key to decrypt the application filesystem. This is doable. It requires a bit of rearranging of the rc.d scripts (depending on what exactly is contained in the unencrypted/encrypted disk parts), and brings the problem that somebody might tamper with the unencrypted OS installation and drop a backdoor in there.
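A minimal sketch of that second approach, assuming a passphrase-protected geli provider holding the application filesystem; the device name, mount point, and services are made up:

```sh
#!/bin/sh
# /root/unlock.sh -- run by the admin over ssh after the box has booted
# from the unencrypted base system. Assumes:
#   geli init -s 4096 /dev/ada0p4            (passphrase-protected)
#   /etc/fstab:  /dev/ada0p4.eli  /data  ufs  rw,noauto  2  2
# Daemons needing /data are disabled in rc.conf and started here instead.

geli attach /dev/ada0p4 || exit 1   # prompts for the passphrase on the tty
fsck -p /data
mount /data

service postgresql onestart   # hypothetical services living on /data
service nginx onestart
```

The tampering concern stands: everything up to and including sshd runs from the unencrypted part, so this protects the data at rest, not the running system.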
 
Security mainly boils down to trust; but from what has been unveiled of human nature, the latter is at least a complex topic, to put it politely. A psychologist once told me the most convincing definition: in the end, trust is a decision you make.

Not quite to the tune of the Beverly Hillbillies but close enough to the Ozarks where they lived.

*sing*
Let me tell ya'll a story full of dread
Bout snooty people who let schooling
got in college go to their fat heads

They hired a GED guy to do what they said
they thought dumb as a cob and looked cornfed
They looked down their nose at him and smiling said
Here's a floppy for our AppleII sitting there to be fed
never thought he had a chance of one in his head

He never touched one in his life but dummied up
to let them know that he'd rather blowed his head off
That country boy had more smarts than they thunk
and 3 days later had figured out that box, owned it
their printer and data on all floppy entered in his head

His mental health skills they could never hope surpass
make fools of grads a game long his to make time pass.
He was their Superior in every way but thought a fool
The surprise fun in that game setting fire to their ass
A character flaw he saw, owned, overcame, and got past.

That's what can happen and continues to this day
a fire alarm in St. Louie someone might hear Monday
But it won't be my homie tasked to put out a firestorm
It's time set to blaze and only to help keep him warm
*sing*

If that's hard to process and not understandable, it only seems that way. I've told that story before and the rest dues owed the Red Devils Advocate by people of less than Professional Character and devoid of Ethics required of their position.

I'm well equipped for mine and this not my character flaw to own, it's still the same game but one they played on someone with a type of Quantum Entanglement going for him. Not Spooky action at a Distance. Stupid looking at a distance, and far beyond those in sight can perceive it somehow.

It's a blessing and a curse.
 
How do the military, top-level governmental & financial institutions etc. handle this issue? AFAIK they start at the bottom with a leased line/dedicated wire (you could rent that from your telco some time ago, when I was in the business world). And the next layers? From my limited understanding, a dedicated wire is not strictly necessary for security if you use encryption with endpoint authentication. Its benefits are predictable latency & bandwidth & a decreased attack surface (no one else can (phone) call that number because there is none (but beware: someone inside the telco can do that)). I dunno if you can still rent a leased line/dedicated wire (a real physical copper wire or optical fibre threads reserved solely for you). Maybe nowadays, where it's all routed through ATM, that's not necessary anymore, and maybe not even possible. I remember we had issues concerning analog switches (inside the telco boxes along your pavement) on the route of our leased line. You can instead make a contract that ensures a certain latency & bandwidth - fine, that's what you want, because you encrypt your traffic anyway, and, as noted above, there is no true benefit in terms of security in using a leased line.
Sorry for the convoluted writing - I tried, but couldn't do better.

Isn't there some secure encrypted OOB management (IPMI/BMC/iLO/whatever) available? I can't believe that. Being old-school, I remember solutions like
leased line -> modem -> [TA (multiport terminal adapter; cua0....n) -> box in a rack | box]
In this old-school example, the critical point is how to authenticate the box's encryption module. Does geli(8) offer that? If not, using geli(8) in a remote setup is phony/fake security.
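As far as the "can geli authenticate the data?" part goes: geli(8) does offer optional data integrity verification. If the provider is initialized with -a, every sector carries an HMAC keyed from the master key, so a swapped or tampered disk cannot produce sectors that pass verification. A sketch (device name made up):

```sh
# Initialize with AES-XTS encryption plus per-sector authentication.
geli init -e AES-XTS -l 256 -a HMAC/SHA256 -s 4096 /dev/da2
geli attach /dev/da2   # reads that fail HMAC verification are rejected
```

Note this authenticates the data on the disk, not the remote console path the passphrase travels over; that path still has to be secured separately.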
 
How do the military, top-level governmental & financial institutions etc. handle this issue?
Let's look at typical "spooks", meaning government agencies, typically intelligence, law enforcement, and military.

Networking, traditional answer: There are no communication wires going into the data center. None, zip, zilch. Modern answer: the only communications wires going in and out use dedicated circuits that go to other data centers of the same organization, and use very strong hardware encryption on those circuits. For example, all major cloud/hosting providers (those that have more than one data center) encrypt their dedicated circuits that leave buildings, after they found that certain government agencies were wiretapping their circuits (yes, the NSA was spying on the likes of Amazon, Facebook and Google). The big ones in the industry all own their own cables (yes, the likes of Amazon, Google and Microsoft own their own terrestrial and undersea cables). If the site has to be connected to the public network, this will be done with carefully designed firewalls; the slang term for that is "spanning the air gap" (a bridge that goes across the air gap that separates internal and public networks).

Personal communications, traditional answer: while inside the building, nobody will have a cell phone, radio, or camera. The only laptops that can be brought in and out are those where WiFi can be turned off in hardware (many name-brand laptops have switches for that). All communication is via desk phones, usually some encrypted IP phone (they tend to look like Cisco hardware). Modern answer: Agency-issued cell phones can be brought into certain parts of the building, for example conference rooms, but typically not personally owned ones. Typically, there will be sniffers and jammers to prevent communication from happening. Forget Bluetooth and WiFi. And USB sticks are a complete no-no.

Access: At the gate, there is a set of guards. They have assault weapons. They check your credentials, and only let you in if you are authorized to enter. That includes personnel of the organization itself (so for example NSA employees who have a good reason to be in the data center, and 99.5% do not), and a very small set of employees from contractors. Typical contractors would be SRI, IBM, General Dynamics, Oracle, Lockheed-Martin, EMC. The term "contractor" explicitly does not include things like electricians and plumbers; that's handled by in-house staff. Everyone who has access to the building will have government security clearances, even contractors. The organization has its own emergency response systems (fire, internal law enforcement such as military police, ambulances).

Anecdote: My wife has worked at one of those government labs. Her office neighbor (who was quite elderly and in ill health) walked into her office slowly, and said: "I'm sorry to bother you, but I'm feeling very bad, I think I'm having a heart attack, can you please call an ambulance for me." My wife (who is knowledgeable about heart disease) obviously immediately grabbed the phone and called the emergency number, telling them that someone was having serious heart problems. About half a minute later, she heard heavy steps in the hallway, and outside saw a military-camouflaged HumVee, and then a half dozen marines with machine guns came into her office. That's the organization's standard response to any emergency. Fortunately, some of them were trained as EMTs (paramedics). Another minute later, an ambulance showed up with sirens and brought a stretcher into her office. Another minute later, the on-site staff doctor showed up. Her colleague did fine.

In some government sites, visitors who need to be present but have no security clearance can enter certain buildings, but are escorted. At Livermore for example, in certain buildings they are escorted by an armed security guard who brings a red light on a small stand with him, which he carries in the hallway, then places in front of whatever room the uncleared visitor is in. I hear that this is a rare occurrence, as these facilities typically have a "visitor center" near the entrance where non-cleared people can have meetings. Been there, done that. For example, the NSA today has a publicly accessible museum and gift shop (yes, random people without security clearance can buy a coffee mug or sweatshirt that says "National Security Agency"; a few friends of mine have those). In the old days, the bigger vendors had their own secure rooms in these facilities. For example in mainframe days, a typical data center would have a little office for IBM personnel, where they could hang out, drink coffee, and store tools and frequently used field service parts.

In some installations (typically military intelligence), all people in the building are armed. Meaning a typical "sys admin" (today's term for that job description is "SRE") will have an assault weapon. For a military site, that makes some sort of sense: their intelligence data processing would be among the places at the highest risk of being attacked. Yes, I've been in (phone- and video-) meetings with people who are all in green uniforms, all armed, and all use assumed names, typically by alphabet in the order they go around the table. We all laugh about that, when people get confused: "As my colleague Fred just said ... oops sorry, as my colleague Charlie just said, I got confused because he was Fred in yesterday's meeting ...".

Hardware: Only hardware that has been ordered and vetted can enter the building. I've heard horror stories about field service spare parts being delayed by days (while systems are down), because they need to be shipped to a receiving organization in a non-secure facility (where UPS and FedEx drivers can enter), then checked that they are the real thing, then internally shipped to the data center.

How does technical support work? Typically, suppliers of hardware and software have a small set of field service people who are security cleared and can work on the hardware. There are always protocols to be followed: do not look at actual user data, do not communicate about the size, design or content of the system with your colleagues, do not take photographs, only take handwritten notes, and all handwritten notes need to be checked by a censorship organization before leaving the building. Typically systems are compartmentalized: the service person from IBM looking at servers obviously is not allowed to get near the Cisco networking hardware, and vice versa. Debugging under these circumstances is ... difficult.

Typically, no hardware ever leaves the building. Typically, sites have shredders, and hardware that is obsolete or broken is run through the shredder, and not sold used or returned to the vendor. If hardware has to leave to be diagnosed off-site (for example a disk platter needs to be put under a scanning tunneling microscope to see why so many read errors happened), special protocols are put in place (like a few engineers at the vendor will be specially security cleared). Some sites insist on all persistent storage devices (disks, flash) also implementing hardware encryption (SED or FIPS-140), although that's not actually very common.

By the way, most of the large vendors of cloud and hosting services work pretty much the same way, except typically not with soldiers as security guards, and typically using subcontractors for things like plumbing and electrical work. This does not apply to small banks or small medical services, though; they don't have the scale to be secure.

And just to be clear: I've never had a security clearance, I've never been in a production data center, but I've worked closely with folks who do this kind of stuff. For example being on the phone with them to help, while they're in a conference room with agency staff and a censorship person.

Final anecdote: In a previous job, we had a meeting (at my company's office) with a senior staff member from one of those agencies. Very smart, friendly, and intelligent person. He was kind enough to give me his business card, and it said something like "Government of Elbonia, Central Security Agency, Dr. Adam Bob, Strategic IT design planning", with the flag of Elbonia in the corner. But it had no address, phone number, e-mail, or fax!
 
Great. Very anecdotal, nice to read, etc. I like this kind of story. Can you also supply a story like that which explains how to turn an anecdotal, nice-to-read story into a concrete answer to any of the concrete questions above?
This is of course nonsense, but I let it stand (struck through) so that it's clear what I wrote...
 
ralphbsz has copious credible credentials to back up everything he talks about, exceeding excellence in essential experience and always knows what he's talking about.

I would not think to question his knowledge of or experience in anything he spoke of if it were me.

But your experiences are not mine, and my experiences are not his. My experiences differ greatly, and that is why our posts, related to his, are diametrically opposed.

I know from my perspective there is no need to. You are left wondering what comes from his perspective has beans to do with the frijoles you ordered.
 
MSG(intended) -> SENDER -> MSG(word-by-word) -> RCPT -> MSG(understood)
Usually: intended ≆ understood (approximately but not actually equal)
But sometimes: intended ≇ understood (neither approximately nor actually equal)

ralphbsz, I'm sorry, I discovered that this is a bad misunderstanding... I asked for "other next levels", to which you answered in great detail, just to receive my offhand reply... I was thinking about how a remote admin can be sure that the path from the recipient s/he's typing the password or passphrase to (e.g. sshd(8)), to the geli(8) that receives this secret, is secure.
[EDIT] And how s/he can authenticate that geli(8); an attacker could have swapped the disk for a fake device. Since it's remote, the admin cannot see that. [/EDIT]
Of course we know I neither wrote that nor anything near it. That's clearly my fault; please accept my honest apologies.

Thank you in advance, you're very generous.
 
I was thinking about how a remote admin can be sure that the path from the recipient s/he's typing the password or passphrase to (e.g. sshd(8)), to the geli(8) that receives this secret, is secure.
And the answer to this is: it depends on the design of the hosting site. Let's try to solve an easier problem first: a small customer hosting just a few machines at a commercial hosting provider, again with an encrypted file system; at boot the file system needs to be unlocked, but instead of doing that with a passphrase coming from a human outside the data center, we'll use a key server that's inside the data center. To be reasonably certain that the key server and the networking within the data center are secure, we need to be able to trust the people who designed and implemented those. Most likely, we won't be able to audit and verify the design, since it will be kept secret for security reasons. Even when communicating with really large customers (those that spend a billion per year on computing services), hosting/cloud companies will probably not talk about all technical details. So all that's left is trust: you look at the provider, see how many security incidents they have had, and look at their general stance and attitude towards details and being careful.

Adding having to enter the passphrase via a networking connection from the outside makes the problem much harder, since now the whole network path from one's home machine (where the passphrase is typed in) to the server has to be considered. Usually, we assume ssh login (and in general SSL-protected TCP/IP connections) to be secure enough, but there are lots of difficult details there, such as authentication of the endpoints. Another issue is exact boot order: what is running on the machine before the file system is unlocked?

Here is what I would do if I wanted to host a (virtual) machine at a hosting provider (which I actually do): read their documentation for how data is stored on their systems, understand how they encrypt internally, and then either use their default configuration, or not use hosted machines at all. If you can't trust the provider enough to create a reasonably secure encrypted storage solution, you can't trust them with other stuff either.
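One concrete piece of that endpoint-authentication puzzle, as a sketch: pin the server's ssh host key while the machine is still in a trusted state, and later refuse to type the passphrase if the key does not match. The host name and script path here are invented:

```sh
# At install time, while the box is still trusted, record its host key:
ssh-keyscan -t ed25519 server.example.net > ~/.ssh/known_hosts.server

# Later, refuse to proceed unless the key matches exactly:
ssh -o UserKnownHostsFile=~/.ssh/known_hosts.server \
    -o StrictHostKeyChecking=yes \
    root@server.example.net /root/unlock.sh
```

Caveat: in the partially-encrypted setup discussed earlier, the host key and the unlock script live on the unencrypted part of the disk, so pinning only proves the key has not changed, not that the system is untampered.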
 
What I do is keep the storage at home. My virtual servers have no state; they're basically just public IPs that handle incoming connections and send data back to my home machines over an encrypted tunnel. Sure, the data in flight can still be compromised, but this approach minimizes the exposure.
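For illustration, one way such a stateless front end might be wired up is a plain ssh reverse tunnel from the home machine to the VPS. The host names and ports here are invented, and autossh (security/autossh in ports) is assumed:

```sh
# On the home server: publish home port 8443 on the VPS's public address.
# Requires GatewayPorts yes (or clientspecified) in the VPS's sshd_config;
# autossh restarts the tunnel when it drops.
autossh -M 0 -N \
    -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" \
    -R 0.0.0.0:8443:127.0.0.1:8443 \
    tunnel@vps.example.net
```

A VPN (e.g. a WireGuard or IPsec tunnel) plus a reverse proxy on the VPS would be a sturdier variant of the same idea.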

There are severe disadvantages to this scheme, though. Power outages at your home are annoying. A fire or natural disaster would be fatal.

Thanks to the insane third-party doctrine in US law, however, anything you keep stored in a provider's systems for more than three months is accessible without a warrant. I figure having to get a search warrant for a private home is a somewhat higher burden, but I may be naive.
 