Welcome to an insightful conversation with Dominic Jainy, an IT professional renowned for his expertise in artificial intelligence, machine learning, and blockchain. With a passion for applying cutting-edge technologies across industries, Dominic brings a unique perspective to the often-overlooked human side of cybersecurity. In this interview, we dive into the evolving attitudes toward user responsibility in security breaches, the psychological factors influencing online safety, the disconnect between how professionals design security training and how users prefer to learn, and the delicate balance between security and usability in system design. Join us as we explore how understanding human behavior can transform the way we approach cybersecurity challenges.
How has the perception of users as the “weakest link” in cybersecurity evolved, and why do you think this shift in mindset is important?
I think we’ve come a long way from the days when users were routinely blamed for security failures. A decade ago, it was common to hear that users were the problem—too careless or uninformed. But that narrative is shifting, and I believe it’s a critical change. Blaming users oversimplifies the issue and ignores systemic flaws. If a system is so complex or poorly designed that people have to create workarounds just to get their jobs done, the fault lies with the design, not the individual. This shift in mindset encourages us to build better, more intuitive tools that support users rather than frustrate them.
What role does psychology play in understanding and addressing cybersecurity risks?
Psychology is at the heart of cybersecurity because, ultimately, it’s about people. We can develop the most advanced tech, but if we don’t understand how humans interact with it, we’re missing half the equation. People’s emotions, stress levels, and even their trust in technology influence their decisions online. For instance, someone might click a phishing link not because they’re careless, but because they’re under pressure and the email looks urgent. By studying these behaviors, we can design interventions—like better warning systems or training—that account for human nature and reduce vulnerabilities.
How can a deeper focus on human behavior help combat threats like social engineering?
Social engineering exploits our natural tendencies—trust, curiosity, or fear of missing out. When we understand these triggers, we can build defenses that align with how people think. For example, instead of just warning against suspicious emails, we can teach people to recognize emotional manipulation tactics, like urgency or authority. It’s also about creating environments where people feel safe to report mistakes without fear of blame. If someone knows they won’t be shamed for admitting they clicked a bad link, they’re more likely to come forward, and we can address the issue faster.
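To make that idea concrete, here is a minimal sketch of the kind of defense described above: instead of a generic "this email is suspicious" banner, it names the manipulation tactic it detects so the user learns to recognize the pattern. The cue lists, patterns, and wording are illustrative assumptions, not a vetted phishing filter.

```python
import re

# Illustrative cue lists; a real deployment would tune these from observed phishing samples.
MANIPULATION_CUES = {
    "urgency":   [r"\burgent\b", r"\bimmediately\b", r"\bwithin 24 hours\b", r"\bact now\b"],
    "authority": [r"\bceo\b", r"\bit department\b", r"\blegal\b", r"\bcompliance\b"],
    "fear":      [r"\baccount (will be )?suspended\b", r"\bunauthorized access\b", r"\bfinal notice\b"],
}

def explain_manipulation(email_text: str) -> list[str]:
    """Return warnings that name the manipulation tactic, not just 'suspicious email'."""
    text = email_text.lower()
    warnings = []
    for tactic, patterns in MANIPULATION_CUES.items():
        hits = [p for p in patterns if re.search(p, text)]
        if hits:
            warnings.append(f"This message uses {tactic} cues ({len(hits)} found), a common social-engineering tactic.")
    return warnings

if __name__ == "__main__":
    sample = "URGENT: The CEO needs you to act now or your account will be suspended."
    for warning in explain_manipulation(sample):
        print(warning)
```

The point of the design is the explanation itself: a warning that tells people which emotional lever is being pulled reinforces the training rather than just blocking a single message.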
Why do you think there’s often a gap between what cybersecurity professionals believe is effective training and what users actually prefer?
There’s a real disconnect here, often because professionals prioritize engagement tools like gamification, assuming they’re more interactive and memorable. But users frequently prefer straightforward methods like videos or written guides that they can access on their own terms. I think this gap exists because we sometimes forget to ask users what works for them. We assume we know best, but if someone is juggling a busy workload, a quick video might feel less intrusive than a game that demands active participation. It’s a reminder that user input is crucial in designing effective programs.
How can organizations make cybersecurity training more engaging when many employees feel they already know enough or are too busy to participate?
This is a tough one because perceived knowledge and time constraints are real barriers. Organizations need to make training relevant and bite-sized. Instead of generic annual sessions, tailor content to specific roles or recent incidents within the company—show employees why this matters to their daily work. Also, integrate training into workflows, like short pop-up tips during system logins. If it feels less like a chore and more like a natural part of their day, people are more likely to engage without feeling overwhelmed or dismissive.
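As a rough sketch of what "integrated into workflows" could look like, the snippet below picks one short, role-specific tip to show at login, rotating daily rather than interrupting the user with a full session. The roles, tips, and rotation logic are hypothetical examples.

```python
import random
from datetime import date

# Hypothetical role-specific micro-training tips, surfaced at login.
TIPS_BY_ROLE = {
    "finance": [
        "Verify payment-change requests by phone before acting on them.",
        "Invoice attachments you weren't expecting deserve a second look.",
    ],
    "engineering": [
        "Never paste production credentials or customer data into external tools.",
        "Review OAuth scopes before authorizing a new integration.",
    ],
    "default": [
        "Hover over links to check the real destination before clicking.",
        "Report suspected phishing promptly; no one will blame you for a false alarm.",
    ],
}

def login_tip(role: str) -> str:
    """Return one short tip, deterministic per day so it rotates daily instead of per refresh."""
    tips = TIPS_BY_ROLE.get(role, TIPS_BY_ROLE["default"])
    random.seed(date.today().toordinal())  # same tip all day, new tip tomorrow
    return random.choice(tips)

print(login_tip("finance"))
```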
Why is striking a balance between security, usability, and functionality so critical in system design?
If you lean too heavily on security at the expense of usability, people will find ways around it, often creating bigger risks. Think of overly complex password policies—users might write them down or reuse them across platforms, defeating the purpose. On the flip side, a highly usable system with no security is just an open door for attackers. Functionality ties it together; the system has to do what it’s supposed to do efficiently. Balancing these three ensures that security enhances the user experience rather than hinders it, reducing the temptation for risky workarounds.
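To illustrate the usability side of that balance, here is a minimal sketch of a password check that favors length and a breached-password lookup over rigid composition rules, broadly in the spirit of current NIST guidance. The breached-password set here is a placeholder for a real compromised-credential service.

```python
# A sketch of a password check that avoids composition rules, which tend to push users
# toward sticky notes and reuse, and instead checks length and known-breached passwords.

def load_breached_passwords() -> set[str]:
    # Placeholder: in practice this would query a k-anonymized API or a local breach corpus.
    return {"password123", "qwerty2024", "letmein!"}

BREACHED = load_breached_passwords()

def check_password(candidate: str) -> tuple[bool, str]:
    """Return (accepted, feedback) with guidance a user can actually act on."""
    if len(candidate) < 12:
        return False, "Use at least 12 characters; a passphrase of several words works well."
    if candidate.lower() in BREACHED:
        return False, "This password appears in known breaches; please choose another."
    return True, "Looks good."

ok, message = check_password("correct horse battery staple")
print(ok, message)
```

The design choice worth noting is that every rejection comes with advice the user can follow, which reduces the temptation to work around the policy.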
What are your thoughts on the rapid adoption of technologies like AI and the cybersecurity risks that come with it?
AI is a game-changer, but its speed of adoption is outpacing our ability to secure it. People are using AI tools for efficiency, often without considering the privacy or security implications—like inputting sensitive data into models with unclear data-handling practices. This creates new vulnerabilities, from data leaks to AI-generated phishing content that’s harder to spot. We need to educate users on responsible usage while pushing developers to embed strong guardrails in these tools. It’s a shared responsibility to ensure AI doesn’t become a massive security blind spot.
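As one example of the kind of guardrail mentioned here, the sketch below scrubs obvious sensitive patterns from a prompt before it leaves the organization for an external AI tool. The regular expressions and the `send_to_external_model` boundary are assumptions for illustration, not a complete data-loss-prevention setup.

```python
import re

# Illustrative patterns only; real DLP tooling covers far more categories and edge cases.
REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[REDACTED_EMAIL]"),
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[REDACTED_CARD]"),
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED_SSN]"),
]

def redact(prompt: str) -> str:
    """Scrub obvious sensitive tokens before a prompt is sent to an external model."""
    for pattern, placeholder in REDACTIONS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

def send_to_external_model(prompt: str) -> None:
    # Hypothetical boundary: whatever client actually calls the external AI service.
    print("Sending:", redact(prompt))

send_to_external_model("Summarize this: customer jane.doe@example.com, card 4111 1111 1111 1111.")
```

A filter like this only catches the obvious cases, which is why the education piece matters just as much as the technical guardrail.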
What advice do you have for our readers on staying safe online in an era of evolving digital threats?
My biggest piece of advice is to stay curious and cautious. Technology moves fast, and so do the threats. Take a moment to question anything that feels off—whether it’s an email, a new app, or a request for information. Keep learning about the tools you use; even a basic understanding of how they work can help you spot risks. And don’t hesitate to ask for help if something doesn’t seem right. Cybersecurity isn’t just about tech—it’s about building habits that protect you in a constantly changing landscape.