Do you have to deal with servers without internet access at your job?

yzT Member Posts: 365
We've got around 50 servers which don't have direct Internet access. The servers that need to be reachable from the Internet (the Apache web servers) get requests forwarded from another network, but the servers themselves can't make outbound connections.

This is such a pain, and the result is servers running package versions from 2008, etc. When I do want to install something, I have to install everything manually, chasing down the dependencies, then the dependencies of the dependencies....

I'm wondering whether this is common, or whether it's just my organization ****ing me off...
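One workaround for the dependency chase described above: resolve and download everything on an Internet-connected machine, then carry the RPMs over. A minimal sketch for RHEL/CentOS, assuming yum-utils is available on the connected box and the offline server is reachable over SSH; the hostnames and the package name are illustrative:

```bash
# On an internet-connected machine running the same RHEL release/arch
# (yumdownloader ships in the yum-utils package):
yumdownloader --resolve --destdir=/tmp/pkgs httpd

# Copy the whole set over ("offline-server" is a placeholder hostname).
scp /tmp/pkgs/*.rpm admin@offline-server:/tmp/pkgs/

# On the offline server, localinstall handles ordering among the local RPMs.
yum localinstall /tmp/pkgs/*.rpm
```

One caveat: --resolve only downloads dependencies that the downloading machine is itself missing, so it works best from a box whose installed package set closely matches the target.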

Comments

  • timrvt Member Posts: 28
    Yes. We have a DMZ network where the web/DB servers sit and talk to a proxy web server on an Internet-accessible network, along with limited connections to the internal corporate network.

    What we do for quick fixes/updates is set up a secondary NIC on the internal network, update as needed, then remove the NIC...
  • Tom Servo Member Posts: 104
    It is very, very common, and it isn't just the organization trying to piss you off. It is for security. Granted, security often impacts usability. It may also be a regulatory requirement (PCI, for instance) that those systems not have direct access to the Internet.
  • jibbajabba Member Posts: 4,317
    That is why you can create your own repository. In your case, you could set up a server with Internet access from which your 50 servers download their patches and packages. If you're worried, you can always disable the internal NIC of the patch server until you need to patch or download something from it. Having said that, with decent firewalls you shouldn't need to.
    My own knowledge base made public: http://open902.com :p
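A minimal sketch of the mirror setup jibbajabba describes, for RHEL/CentOS; the repo ID and paths are illustrative, reposync and createrepo come from the yum-utils and createrepo packages, and the patch server is assumed to already be subscribed to the channel it mirrors:

```bash
# On the patch server (the only box allowed out to the internet):
# pull down every package from the upstream repo.
reposync --repoid=rhel-6-server-rpms --download_path=/var/www/html/repos/

# Generate yum metadata for the mirrored tree.
createrepo /var/www/html/repos/rhel-6-server-rpms/

# Serve the tree to the internal network over plain HTTP.
service httpd start
```

Re-running the sync and createrepo from cron keeps the mirror current.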
  • CIO Member Posts: 151
    Yes, it's a very common practice.
  • mikeybikes Member Posts: 86
    Our servers do have Internet access, but we are working toward eliminating it. All Windows updates are managed through a local WSUS server. For the handful of Linux machines, we could easily mirror the vendor's repository. Not providing Internet access to any machine unless absolutely necessary is just good security practice.
  • 4_lom Member Posts: 485
    We have multiple clients with terminal servers that do not have access to the internet.
    Goals for 2018: MCSA: Cloud Platform, AWS Solutions Architect, MCSA : Server 2016, MCSE: Messaging

  • the_Grinch Member Posts: 4,165
    Worked on a number of them with no Internet access. I was lucky in that the software I needed to install had no dependencies, so I was able to just map my local drive and upload it that way. WinSCP is your friend in such situations ;)
    WIP:
    PHP
    Kotlin
    Intro to Discrete Math
    Programming Languages
    Work stuff
  • yzT Member Posts: 365
    jibbajabba wrote: »
    That is why you can create your own repository. In your case, you could set up a server with Internet access from which your 50 servers download their patches and packages. If you're worried, you can always disable the internal NIC of the patch server until you need to patch or download something from it. Having said that, with decent firewalls you shouldn't need to.
    I'm interested in this. How do you do that? Does it matter that the servers run different versions (RedHat 5 and 6)?
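On the mixed-version question, the usual answer is one synced tree per release (RHEL 5 and RHEL 6 packages are not interchangeable), with each client picking its tree via yum variables. A sketch of a client-side repo file; the mirror hostname and directory layout are hypothetical, and note that on RHEL $releasever expands to strings like 5Server/6Server, so name the mirror directories to match:

```bash
# One file works across releases: yum substitutes $releasever and
# $basearch per machine; the quoted heredoc keeps them literal on disk.
cat > /etc/yum.repos.d/internal-mirror.repo <<'EOF'
[internal-mirror]
name=Internal mirror
baseurl=http://patchserver.example.local/repos/rhel-$releasever-$basearch/
enabled=1
gpgcheck=1
gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release
EOF
```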
  • docrice Member Posts: 1,706
    Servers which have direct Internet access are generally considered very risky. It's one thing to allow inbound connections to servers in a DMZ for designated external services like DNS, mail, web, and so on, but having those same servers, or other hosts which only serve internal functions, able to make outbound connections to the Internet is really asking for it.

    If an actor with malicious intent, whether a rogue employee, an external attacker, or malware, is able to connect outbound and grab tools for further exploitation or phone home to a mothership, the organization takes on serious risks and potential repercussions, unless the firewall(s) only allow traffic to very specific IPs or netblocks. But since patch/update/upgrade servers and repos can be scattered across CDNs, allowing access to those resources by IP means opening up very large netblocks, which widens exposure considerably, because those same blocks could easily be home to other undesirables. Some firewalls can do FQDN-based filtering, but that has drawbacks, since the IPs the firewall resolves may differ from moment to moment from what the hosts themselves resolve, especially with public update hosts whose DNS records have short TTLs.

    The only meaningful method is either a whitelisting proxy service ... or, for repositories, a centrally controlled repo handled internally by your organization, where packages are validated and signed off on for consistent rollouts.
    Hopefully-useful stuff I've written: http://kimiushida.com/bitsandpieces/articles/
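For the whitelisting-proxy route docrice mentions, a minimal Squid sketch: allow only named update domains and deny everything else. The domains are examples, not a complete list, and a real config also needs the usual source-network ACLs:

```bash
# Append a whitelist to /etc/squid/squid.conf (domains are illustrative;
# the deny-all rule must come after the allow rule).
cat >> /etc/squid/squid.conf <<'EOF'
acl update_hosts dstdomain .redhat.com .centos.org
http_access allow update_hosts
http_access deny all
EOF
service squid restart
```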