The Door That Undoes Everything
You've built a genuinely impressive technical security program. Your SIEM ingests 80,000 events per day. Your EDR has 99% coverage. Your vulnerability management program closes criticals within 30 days. Your penetration test came back clean. You're running MFA everywhere. You have a red team.
Your server room door has been propped open with a fire extinguisher for three weeks because the facilities team is waiting on a part for the door closer.
That extinguisher is your security posture in one image. Everything upstream of physical access — every layer of logical security, every network control, every authentication mechanism — has a bypass condition: an adversary in the room. Physical access to a running server is game over before incident response even gets paged. Physical access to a network closet means an attacker can install a tap, a rogue access point, or a persistent implant that will survive your next OS rebuild. Physical access to an unlocked workstation means your six-figure DLP implementation is a footnote.
Physical security is not a separate domain from cybersecurity. It is a prerequisite for cybersecurity. The convergence of Physical Access Control Systems (PACS) and logical Identity and Access Management is not just an architectural preference — it's a recognition that the same identity that should have MFA on the VPN should also require authentication to enter the data center. If those systems don't talk to each other, you have gaps that an attacker with basic social engineering skills will find.
Tailgating: The Attack That Never Goes Away
Tailgating — following an authorized person through a controlled door without presenting your own credentials — is the most common physical security attack and the hardest to eliminate through technology alone. It works because humans are polite. Holding the door for someone with their hands full is a social norm that most people don't override, even in secured facilities, even with security awareness training.
The technical countermeasures for tailgating are well-established: mantrap vestibules (two-door airlocks where the second door won't open until the first is closed and only one person has entered), turnstiles, optical detection systems that count the number of bodies passing through a controlled entry, video analytics on door events. These controls exist, they're deployed in high-security facilities, and they're expensive and operationally disruptive enough that most organizations don't deploy them broadly.
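The core of a mantrap is an interlock invariant: the inner door never releases while the outer door is open, and only when exactly one body is in the vestibule. A minimal sketch of that logic (the class name, fields, and people-counter input are illustrative, not any vendor's actual controller API):

```python
class MantrapController:
    """Minimal interlock sketch for a two-door mantrap vestibule.

    Invariant: the inner door only unlocks when the outer door is
    confirmed closed AND the people-counter reports exactly one
    occupant in the vestibule. All names here are illustrative.
    """

    def __init__(self):
        self.outer_closed = True   # from the outer door position sensor
        self.occupant_count = 0    # from an optical people-counter

    def request_inner_unlock(self):
        # Deny on an open outer door (tailgate path still available)
        # or on any occupant count other than one (piggybacking).
        if self.outer_closed and self.occupant_count == 1:
            return "UNLOCK"
        return "DENY"
```

Note that the occupancy check is what distinguishes a mantrap from a plain double door: two people badging through together still get denied at the inner door.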
What most organizations actually rely on is badge access plus camera coverage plus the assumption that employees will challenge unfamiliar faces. That last component is the weakest link. Security culture around challenging tailgaters requires consistent messaging, management modeling, and a reporting mechanism that doesn't make employees feel like informants. It also requires that the physical security team actually follows up when challenges are reported — if employees challenge tailgaters and nothing happens, the behavior stops.
The red team technique for tailgating is straightforward: carry something that makes it awkward to swipe your own badge (both hands full, a dolly with boxes, a large monitor), approach a door as someone is exiting, look purposeful and slightly rushed. The exit event is actually easier to exploit than entry — people leaving through a controlled door will almost universally hold it for someone approaching from inside, even though that person could be an unauthorized person who piggybacked in on the other side. Entry and exit are both attack surfaces.
Badge Cloning: Less Exotic Than You Think
The Proxmark3 is a research-grade RFID tool that can read, analyze, emulate, and clone proximity cards. It is not a sophisticated nation-state tool. It is a device you can buy for under $400 that fits in a jacket pocket, and it can read a standard HID Prox card from several inches away with the right antenna configuration. The attack is a passive read in a crowded space — an elevator, a lobby, a parking garage — followed by cloning the credential to a blank card.
HID Prox (125kHz) cards are vulnerable to this by design — they're a decades-old technology that was never intended to resist cloning. If your physical access control infrastructure still relies on 125kHz proximity cards, you're operating on a 1990s-era technology standard whose credentials the security research community demonstrated, years ago, could be cloned with commodity equipment.
The upgrade path is iCLASS SE or SEOS (also HID) or MIFARE DESFire — 13.56MHz smart card standards with cryptographic authentication that resist cloning. The migration is expensive, time-consuming, and requires replacing readers, cards, and potentially back-end infrastructure. Most organizations delay it for years. The result is a PACS infrastructure that looks secure — cards, readers, door controllers, management software — but whose fundamental authentication mechanism is defeatable with a commodity tool.
Wiegand, the communication protocol that runs between badge readers and door controllers in most legacy PACS installations, has its own vulnerability: it transmits credentials in cleartext between the reader and controller. Someone with physical access to the wiring behind a reader can intercept and replay credentials. This is a well-documented attack in the physical penetration testing community. OSDP (Open Supervised Device Protocol), running in its Secure Channel mode, is the encrypted replacement. How many of your door readers use OSDP? If you don't know, the answer is probably not many.
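To make the cleartext point concrete, here is a decoder for the standard 26-bit Wiegand frame — the most common legacy format. Everything an attacker needs is right there on the wire: two parity bits, an 8-bit facility code, and a 16-bit card number, with no encryption or freshness.

```python
def decode_wiegand26(bits):
    """Decode a standard 26-bit Wiegand frame.

    Layout: [even parity][8-bit facility code][16-bit card number][odd parity].
    Bit 0 is even parity over bits 1-12; bit 25 is odd parity over
    bits 13-24. Anyone tapping the reader-to-controller wiring sees
    these bits in the clear and can replay them.
    """
    if len(bits) != 26:
        raise ValueError("expected 26 bits")
    even_ok = bits[0] == sum(bits[1:13]) % 2           # even parity, first half
    odd_ok = bits[25] == (sum(bits[13:25]) + 1) % 2    # odd parity, second half
    facility = int("".join(map(str, bits[1:9])), 2)
    card = int("".join(map(str, bits[9:25])), 2)
    return facility, card, even_ok and odd_ok
```

The replay attack needs no decoding at all — the captured bit pattern is itself the credential — but decoding shows how little entropy the format carries: 65,536 possible card numbers per facility code is also within brute-force range at an unmonitored reader.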
USB Drops and the Rubber Ducky Problem
The USB drop attack sounds cartoonish until you look at the research. The 2016 study co-authored by Elie Bursztein dropped 297 USB drives across the University of Illinois campus and measured how many were plugged in. Forty-five percent phoned home after someone plugged them in and opened a file, the first within minutes of being dropped. People picked them up. People plugged them in. Often because they were trying to figure out who they belonged to.
The Rubber Ducky (Hak5) is a USB device that presents itself to the host OS as a Human Interface Device — a keyboard — and types a pre-programmed payload at hundreds of keystrokes per second. No driver installation, no AutoRun, no user prompt. The OS sees a keyboard. The payload executes. A user who plugs in what looks like a USB drive can be fully compromised in under 30 seconds through a keylogger install, a reverse shell, credential harvest from the browser, or all three.
The Rubber Ducky has since been joined by more sophisticated relatives. The O.MG Cable presents as an ordinary charging cable with an embedded implant. The Bash Bunny can switch between multiple attack modes, including network impersonation and storage-device attacks. All are commercially available, documented, frequently updated, and used both by red teams and by actual threat actors.
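The machine-speed typing is also a detection opportunity. Keystroke-injection defenses (the approach behind several open-source "duck hunting" tools) flag input streams whose inter-keystroke gaps are too short and too regular to be human. A sketch of the core heuristic, assuming you can tap keystroke timestamps on the endpoint; the window size and the 35ms floor are illustrative thresholds, not calibrated values:

```python
from statistics import median

def looks_like_injection(timestamps_ms, window=20, human_floor_ms=35):
    """Heuristic flag for keystroke-injection (Rubber Ducky-style) input.

    A HID injector types with very short, machine-regular gaps between
    keystrokes; humans rarely sustain a median gap below ~35 ms across
    a 20-keystroke window. Returns True when the recent stream looks
    injected. Thresholds here are illustrative and need tuning.
    """
    if len(timestamps_ms) < window:
        return False  # not enough signal to judge
    recent = timestamps_ms[-window:]
    gaps = [b - a for a, b in zip(recent, recent[1:])]
    return median(gaps) < human_floor_ms
```

A real deployment would quarantine or soft-block the new HID device pending user confirmation rather than just flag it, since the payload finishes in seconds.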
The technical controls for USB attacks exist: USB device control policies in your endpoint security platform (CrowdStrike Falcon has USB device control, as does Carbon Black), Group Policy restrictions on removable media, disabling HID device installation via policy. These controls are effective but operationally disruptive — legitimate USB use cases (keyboards, mice, presentation clickers) need carve-outs. The controls also require that the device be managed. A visitor's laptop, a contractor's personal workstation, any device without your EDR agent — those are uncovered.
The physical control is simpler: USB port blockers, epoxy in unused ports for high-security workstations, workstation placement policies that limit physical access to unattended machines. These are low-tech and effective in high-security areas. A visitor who can't get close enough to a sensitive workstation to plug something in doesn't need a USB policy to stop them.
The PACS-IAM Convergence Gap
In most organizations, Physical Access Control Systems and logical Identity and Access Management live in separate organizational silos. PACS is owned by facilities or security operations. IAM is owned by IT. They may share identity data to some degree — provisioning a badge when someone's AD account is created — but the lifecycle management, the access governance, and the incident response processes are separate.
The security consequences of this gap are concrete:
Terminated employees with active badges. Your IAM process correctly disables the AD account and revokes SSO sessions within hours of a termination. Your PACS deprovisioning requires a manual request to a separate facilities system with a different ticketing process and a 24-48 hour SLA. For that window, the former employee has no logical access but can still walk into the building. This is documented in workplace violence prevention literature as well as security incident case studies.
Access reviews that exclude physical access. Your IGA certification campaign reviews application entitlements. Nobody is reviewing building and floor access for whether it's still appropriate. An employee who moved to a different department two years ago and no longer needs access to the R&D floor still has it because nobody's certification campaign covered building access.
PACS events not in SIEM. Your SIEM has no visibility into badge access events. An attacker who has cloned a badge, or a malicious insider accessing areas outside their normal pattern, produces no detectable signal in your logical security monitoring. Physical access anomalies and logical access anomalies should be correlated — "this service account was used from an IP on the 4th floor at 2am while the badge associated with this user's identity shows them as not having badged into the building today" is a meaningful detection.
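The correlation described above is simple once badge events reach the SIEM. A sketch of the core join, assuming daily granularity; the event shapes and the `"onsite"` source tag are illustrative placeholders, since real PACS exports and SIEM schemas vary by vendor:

```python
def badge_logon_gaps(badge_events, logon_events):
    """Flag on-premises logical access with no matching physical presence.

    badge_events: set of (user, date) pairs from the PACS export --
        one entry per user per day they badged into the building.
    logon_events: list of (user, date, source) from the SIEM, where
        source is 'onsite' for logons from office-network IPs.
    Shapes and field names here are illustrative, not a real schema.
    """
    alerts = []
    for user, date, source in logon_events:
        if source == "onsite" and (user, date) not in badge_events:
            # On-prem logon, but the badge system never saw this
            # person enter the building that day: cloned badge,
            # tailgater, or shared credential.
            alerts.append((user, date))
    return alerts
```

Production versions would correlate on time windows rather than days and weight by zone (a logon sourced from the server-room VLAN with no server-room badge event is a much stronger signal), but the join itself is this small.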
Lenel, Software House, Genetec, and other PACS platforms have APIs. Your IAM platform almost certainly has an integration framework. The convergence work is doable. It requires someone deciding that it's worth doing, which requires someone framing the risk in terms that motivate a cross-functional initiative.
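Even before a full integration, a scheduled reconciliation job closes the worst of the terminated-employee gap. A sketch, assuming you can pull a disabled-accounts feed from IAM and an active-badge feed from the PACS API; both feeds and field names are illustrative, since Lenel, Software House, and Genetec each expose their own schema:

```python
def stale_badge_report(iam_disabled, pacs_active):
    """Cross-check IAM-disabled identities against still-active badges.

    iam_disabled: dict of user_id -> disable timestamp (from the IAM feed).
    pacs_active:  set of user_ids with an enabled badge (from the PACS feed).
    Returns the stale badge holders, longest-disabled first -- the
    people who can still walk into the building after losing all
    logical access.
    """
    stale = pacs_active & iam_disabled.keys()
    # Earliest disable timestamp first: the badge that has outlived
    # its account the longest is the most urgent revocation.
    return sorted(stale, key=lambda u: iam_disabled[u])
```

Run nightly and routed to whoever owns badge deprovisioning, this one query shrinks the 24-48 hour window to whatever your job interval is.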
Social Engineering at the Physical Layer
Physical penetration testing — also called physical red teaming — tests the full chain of physical security controls, typically through social engineering rather than technical attacks. The practitioner community (Deviant Ollam, the authors of "The Art of Intrusion," physical red teamers at most major consulting firms) has documented the techniques extensively.
Common pretexting scenarios: HVAC or facilities contractor responding to a work order (in a high-vis vest with a clipboard, almost universally effective), IT support technician responding to a reported outage, delivery personnel with a package requiring a signature from someone inside, vendor representative for an equipment vendor with a legitimate reason to be near server infrastructure.
What these scenarios exploit isn't a specific technical vulnerability. They exploit the gap between written policy and practiced behavior. Your policy says to verify vendor identity against a pre-scheduled appointment. The receptionist, dealing with six other tasks and a social norm against being rude to someone in a contractor uniform, waves people through and calls upstairs to verify after the fact, if at all.
Physical security awareness training that goes beyond "don't tailgate" — that gives employees a mental model for how pretexting attacks work, a specific script for challenging unknown visitors, and organizational air cover for doing so — is one of the highest-ROI investments in this domain. The technology stops sophisticated attackers. Training stops the most common ones.
The Data Center Physical Security Standard
Tier III and Tier IV data centers certified under the Uptime Institute standard are engineered for availability, and the reputable colocation facilities that carry those certifications pair that engineering with rigorous physical security. If your organization co-locates in such a facility, you're benefiting from that investment. Multi-factor physical authentication (badge plus PIN or badge plus biometric), 24x7 on-site security, mantrap entry, 90-day camera retention, cage-level access controls — these are table stakes at serious colocation facilities.
Your own offices, branch locations, and on-premises closets are almost certainly not at this standard. The question is whether they need to be, based on what they contain and what access to them enables.
A network closet in a branch office that contains a managed switch and a patch panel is a different risk profile than a server room containing authentication infrastructure, secrets management systems, and backup media. The former needs reasonable controls — locked door, camera, alarm. The latter warrants the full treatment: access logging, mantrap consideration, clean-desk requirements for printed materials, cable lock or physical anchor for any removable media.
Inventory what your physical spaces contain, what logical access their compromise would enable, and what controls are currently in place. Map the gaps. Most organizations discover that one or two high-priority locations need significant investment and several more need minor adjustments. The server room door propped open with a fire extinguisher is usually fixable without major capital spending — it needs a facilities work order and someone with enough authority to make it a priority.
Physical Security Is Not Facilities' Problem Alone
The organizational separation between physical security and cybersecurity is a legacy of how these disciplines evolved. Physical security came from the guard-and-gate tradition. Cybersecurity came from network operations and software development. They developed separate professional communities, separate frameworks, separate reporting structures.
That separation is increasingly untenable. An attacker who tailgates into your office and plugs a Bash Bunny into a workstation is executing a cyber attack through a physical vector. The incident response for that event involves both disciplines. The controls that would have prevented it span both disciplines. Treating them as separate domains means each discipline has incomplete visibility into the attack surface.
The convergence of PACS and IAM, physical red teaming as part of the security assessment program, PACS event data in SIEM, and physical security controls in the same risk register as logical security controls — these are the operational indicators that an organization is treating physical security as the prerequisite for cybersecurity that it actually is.
Your six-figure SIEM logs everything. Your EDR catches every suspicious process. Your network detection and response sees every unusual connection. And none of it matters if someone walked in behind a delivery truck and is currently sitting in your server room with a laptop on a KVM.