As new technology emerges at a rapid pace, so does the threat of cyber attacks on pipelines. Will Gage, an expert in ICS cybersecurity, joins the Pipeliners Podcast to discuss the two-pronged challenge of (a) ensuring that operators are properly trained on how to use technology and (b) preventing attackers from taking over operations.
Will also takes a critical look at the technology we all use on a regular basis and the potential vulnerabilities we are not even aware of when using these devices. Be sure to listen to this fascinating discussion of ICS cybersecurity.
Show Notes, Links, and Insider Terms
- Will Gage is the supervisor of SCADA infrastructure and cybersecurity for pipeline transportation company Enterprise Products Partners. Will also serves on the Board of Directors of ENTELEC (the Energy Telecommunications and Electrical Association).
- Find and Connect with Will on LinkedIn.
- The ICS (Industrial Control System) Cyber Security Conference is a gathering of ICS cybersecurity stakeholders across various industries — energy, utility, chemical, transportation, and manufacturing — focused on discussing solutions and protection strategies for critical infrastructure organizations.
- The NIST Framework, which was created through a collaboration between industry and government stakeholders, consists of standards, guidelines, and practices to promote the protection of critical infrastructure. The prioritized, flexible, repeatable, and cost-effective approach of the Framework helps owners and operators of critical infrastructure to manage cybersecurity-related risk.
- The SANS Institute focuses on strengthening the cybersecurity of ICS. Their initiative is equipping security professionals and control system engineers with the security awareness, work-specific knowledge, and hands-on technical skills they need to secure automation and control system technology.
- The API (American Petroleum Institute) is a national trade association that represents all aspects of America’s oil and natural gas industry.
- The INGAA (Interstate Natural Gas Association of America) is a trade organization that advocates regulatory and legislative positions of importance to the natural gas pipeline industry in North America.
- InfraGard is a partnership between the FBI and members of the private sector, through which the FBI shares threat intelligence, but only with approved individuals who pass screening.
- The ICS-CERT (Industrial Control Systems Cyber Emergency Response Team) works to reduce risks within and across all critical infrastructure sectors by partnering with law enforcement agencies and the intelligence community and coordinating efforts among Federal, state, local, and tribal governments and control systems owners, operators, and vendors. Additionally, ICS-CERT collaborates with international and private sector Computer Emergency Response Teams (CERTs) to share control systems-related security incidents and mitigation measures.
- Clint Bodungen is a VP of ICS cybersecurity, a published author, risk analyst, and expert in cybersecurity.
- ISA works to secure control systems using ISA Secure components and systems from the ISA Security Compliance Institute (ISCI), and has developed the most comprehensive standard, ISA99, which has now become the global industrial cybersecurity standard IEC 62443.
- The NERC CIP (North American Electric Reliability Corporation – Critical Infrastructure Protection) is a plan or set of requirements designed to secure the assets required for operating North America’s bulk electricity system.
- API 1164 is a document that outlines SCADA cybersecurity processes that could take years to implement correctly depending on the complexity of the SCADA system.
- ISO (International Organization for Standardization) brings together international bodies to generate standardized policies and procedures for international trade and interaction. ISO/IEC 27032:2012 provides security guidelines for cybersecurity.
- IEEE (Institute of Electrical and Electronics Engineers) is an association of technical professionals whose objectives are the educational and technical advancement of electrical and electronic engineering, telecommunications, computer engineering and allied disciplines.
- CNSSI refers to instructions issued by the CNSS (Committee on National Security Systems), an intergovernmental organization consisting of 21 members that sets policy for the security of U.S. national security systems.
- NSS (Network Security Services) is a set of libraries that supports the development of security-enabled client and server applications across multiple platforms. NSS also provides open-source implementation of cryptographic libraries.
- The NRC (Nuclear Regulatory Commission) licenses and regulates U.S. civilian use of radioactive materials to protect public health, safety, and the environment.
- SP 800 refers to the Special Publications series released by the NIST that reports on the Information Technology Laboratory’s (ITL) research, guidelines, and outreach efforts in information system security. The goal is to protect controlled unclassified information in non-Federal information systems and organizations.
- DMZ refers to a subnetwork that presents an organization’s external-facing services to the Internet, requiring an additional layer of security for the organization’s LAN. This firewalls the organization’s private network and only exposes public information through the DMZ.
- Finally, a light look at cybersecurity from this classic “Suki” commercial.
Russel Treat: This is Russel Treat, and this is the “Pipeliners Podcast,” episode number four.
Announcer: The Pipeliners Podcast, where professionals, Bubba geeks, and industry insiders share their knowledge and experience about technology, projects, and pipeline operations. Now, your host, Russel Treat.
Russel: Thanks for listening. We want you to know we really appreciate you taking the time to listen to this episode, and to show our appreciation, we’ll tell you about our prize pack.
We are offering a free stainless steel YETI tumbler to one listener every episode. Here’s how you register to win. Simply visit pipelinerspodcast.com/win, that’s pipelinerspodcast.com slash the word “win,” to enter yourself in the drawing. It’s our way of saying thank you for supporting the show.
We’re very fortunate today to have William Gage as our guest. I had the distinct privilege of serving with Will on the board of directors for the ENTELEC conference, got to know him, and had the opportunity to pull him over to the side one day and say, “Hey, what is it that you do for a living?”
Frankly, I got to tell you, Will’s got a really interesting background, particularly in terms of cybersecurity, SCADA, and all that. With that, let me introduce you to Will Gage. Welcome to the Pipeliners Podcast.
William Gage: Russel, thanks. I appreciate the opportunity to be here.
Russel: Listen, I appreciate you participating and being one of our early guests. Why don’t you share with the listeners how you came to get interested in cybersecurity?
William: Let me just start off by saying I was fortunate to grow up in a household with a computer early on. When I was able to start reading, I was learning how to use MS-DOS and Windows 3.1. That dates me a little bit.
Getting in through the middle school and the high school, I had quite the active early career in computer systems networking and things of that nature, but it really wasn’t until almost out of college.
Facebook, the wonderful Mark Zuckerberg invention, his brainchild. I signed up and was using it a little bit. This was right after college. I was like, “This is interesting. All of a sudden, stuff’s showing up on my wall. That’s not supposed to do that.”
Come to find out, it was an early, early Facebook attack, so I shut down my account and was like, “That’s not supposed to happen.” I guess I have to thank Mr. Zuckerberg for launching me into cybersecurity.
From that, I’ve been working in the industry, oil and gas, doing a myriad of different jobs, everything from installing meters out on runs, setting up vibration pressure sensors on compressors and various other equipment, all the way into actually changing out SCADA systems, doing help desk email, the full nine yards, networking, communications. I’ve had my hands in a lot of things.
Early on, I took the concept of my Facebook exploration to heart and began teaching others, “We’re all manipulable in some form and manner.” What I mean by that is the social engineering sphere. I’ve used that a lot and, I think, fairly successfully in my career thus far.
Now, I work for Enterprise Products Partners in Houston, Texas. We’re a very large pipeline transportation company. We’ve got about 50,000 miles of pipeline.
My group – our responsibilities are keeping the SCADA system running, always having the uptime and availability for our pipeline controllers to operate that safely, first, and then, secondly, to meet our business needs.
That’s a lot of responsibility, and quite a bit of that comes with the cybersecurity perspective, managing all these users. How do we work with all these people? You have people with basically no computer experience whatsoever all the way up to people that work side by side with our department who are quite good at computers. They’re typically the most dangerous from a cyber perspective.
Russel: William, one of the things that I do is I teach a SCADA fundamentals class. When you look at the things that cause system failures in the SCADA world, almost all of them, or the lion’s share of them, certainly, are related to human intervention.
Often, it’s people doing routine maintenance or jobs they need to do. Something happens. Something goes wrong. They get something wrong. In other words, it’s human error or human intervention that often causes SCADA failure.
Having the opportunity to read some of the intelligence in the cybersecurity about what’s going on in that domain, just in the last few years, it seems like there’s been a significant increase in the activity in the cyber domain, people just being more aggressive, and starting to actually be more aggressive in the industrial controls area. Are you seeing that in what you’re doing?
William: Absolutely. Well, let me back up. I’ll say, “Yes I do.” However, at least in my world, we have a lot of visibility because of the things that we put together, but I would say across the entire sector, SCADA and ICS vulnerabilities are drastically up.
Just the numbers — and I have some of these memorized — if you think about an 85 percent increase in the vulnerabilities disclosed from 2010 — I think it’s 2009, 2010 into 2012, you went from somewhere in the high 20s, to almost, I think it was 240 disclosures. Definitely seeing the trend of, yes, there is a lot of confirmations and eye openers that are happening across the sector.
By sector, I mean ICS across electrical, wastewater, and a myriad of other sectors that utilize the same world that we live in. There are brothers and sisters over there having just as many problems. I definitely think that with those disclosures, we’ve been quite — and I try not to jinx ourselves — but we’ve been quite good about being, almost, in certain areas, ahead of the ballgame.
William: Now saying that, I understand that there are areas where we’re still lacking. I know that the entire idea of the NIST Framework was to help incentivize. You can have a benchmark, but again, I think that it comes into not setting a standard per se. I think it’s everybody getting on the same page.
Russel: Yeah, just for the listeners, I know that… and a lot of these things, when you start talking about technology, this can get fairly jargony, if you will… ICS is a common phrase. You see it used a lot more in the process industries than in pipeline, but it refers to the industrial control system.
In the pipeline world, that’s everything from the meter stations, or compressor stations, or pump stations, all the way back through the computer systems, and systems used to operate the pipeline.
You also mentioned the NIST Framework. Can you tell us a little bit about what is the NIST Framework, and why should I care as a pipeline operator?
William: NIST is a great tool, as I call it. It’s a toolbox. So, you have the 800 series; you’ve got a few pieces in there. 800-53 is the big one that I like to look at. It all comes down to this: what are you doing from a cybersecurity standardization perspective?
William: If I go and look, it’s the best of the best, as I like to call it. You go through and you’re like, “Okay, this is the best practice to do this.” That might be setting up a sandwich DMZ for controls, delegated responsibilities, and least privilege.
It ties in, I think, and what’s so good is that for years we have had SANS and API and INGAA and a number of other industry leaders that said, “You know, we’re going to come out and take a posture, and kind of help everyone reach a level of security that we feel we all should be at together, since we all share the same data, since we’re all connected together.”
This basically takes it to the national level.
Russel: I’m certainly no expert in cybersecurity, but I know enough about it, because being a company that offers SCADA solutions, and control solutions to pipelines, we need to be knowledgeable about that.
One of the things that we participate in, or I participate in actually, is InfraGard, which is a partnership between the FBI and the private sector, where the FBI shares threat intelligence with the public. They have a fairly rigorous process of vetting people to join. I think you’re probably a member of that as well. Do you find that useful?
William: I find that useful. Also, more interestingly, I really believe that that sector of cybersecurity is really postured towards our friends over in IT and the Internet-facing domains, especially financial systems, which I think it serves best.
I prefer my friends over at ICS-CERT. I like to talk to them frequently during the week and keep up with things there because I believe…
Even though I know this information gets disclosed through all these various government alphabet departments, I find that my relationship with the folks at ICS-CERT, or even TSA, is a great partnership for a pipeline transportation company.
I think that they’re very keenly aware of what we’re going through, of course. They have that understanding of what it is we do and why it’s so important.
Russel: I think that’s well stated. Again, you’re using a lot of jargon and I get it. I know what you’re talking about. Some of the people listening to this might not.
Russel: No, no, no, it’s fine. What we’ll do is after the show, we’ll make sure that we take all these buzzwords and organizations and such and we’ll link them all up on the show notes page.
If this is something you want to know more about, you can go to the podcast website and the show notes for this episode and like Will said, the [laughs] alphabet soup that we’re talking about here.
The other thing that I know about cybersecurity is it’s not just cybersecurity. You mentioned something earlier that I think is probably a topic that’s important to talk about and it is really much broader than pipelining, and that’s this idea of social engineering.
What is social engineering and what would I need to be aware of to understand and be prepared for that kind of security threat?
William: Yeah, this is one of my favorites. I remember thinking back on what social engineering really is and what forms it takes.
I love social engineering. I like to tell the story of the Battle of Troy, right? What got built? This big old freaking horse that was hollow and was holding troops, and then they brought it and they’re like, “Oh, here’s a gift,” and then everybody comes out.
They social engineered that. They thought it was a present. It’s exactly the same today. Various people are using methodologies engineering their way into you trusting them. That’s really all it is.
You could do that from a physical perspective, you could do it from a cyber perspective, and I think the two of those working together are what a number of the cybersecurity breaches we’ve had in the past are attributed to.
From a perspective of “How do I protect against social engineering?” the answer is always: train the people that work with you and for you. I can’t say that enough, so I’ll say it again. Train the people that work for you and with you, because you need to take the opportunity and realize not everybody knows everything.
I don’t even know everything. I am learning every single minute of the day, reading through material, going in and getting into places where I probably shouldn’t say, and just absorbing the information as much as I can.
From that, I’m sharing with our SCADA folks, with my people, “Hey, think about emails.” Oh, man. I was on Amazon, and all of a sudden I get this email maybe a day later or something, just by coincidence. It says, “Hey, your package is arriving. Go click on this link and you can track your package.”
I’m thinking, “Oh, yeah. I was there and I’m going to go do that.” Immediately, I click on that and because I’m used to shopping at Amazon and I didn’t see anything wrong with that, once I click on that, I’ve immediately given that person trust.
With that trust, then I could download packages that could create a command and control, which means that now I have an opportunity to control this machine and I can use this as a hopping point.
It’s like a beachhead in the military. I think about the Marines a lot. They storm the beach and they get everything secure here and they’re good and then from that point, they move forward. Each of those is a hopping point.
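The phishing chain William describes often hinges on a link whose familiar-looking display text differs from where it actually points. A minimal sketch of checking a URL’s true destination, where both URLs are invented purely for illustration:

```python
from urllib.parse import urlparse

# Hypothetical example: what a phishing email displays vs. the URL the
# link actually points at. Neither address is real.
display_text = "https://www.amazon.com/track-package"
actual_href = "https://www.amazon.com.track-pkg.example.net/signin"

def really_belongs_to(url, trusted_domain):
    """True only if the URL's hostname is the trusted domain or a true subdomain of it."""
    host = urlparse(url).hostname or ""
    return host == trusted_domain or host.endswith("." + trusted_domain)

print(really_belongs_to(display_text, "amazon.com"))  # True: what the victim sees
print(really_belongs_to(actual_href, "amazon.com"))   # False: where the link really goes
```

The trick is that `amazon.com` appears at the front of the hostname, but the domain that actually resolves is the last part, `example.net`, so a simple substring check would be fooled where a suffix check is not.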
Russel: We do a training class in SCADA fundamentals and the last day of that class is on cybersecurity. A good friend of mine by the name of Clint Bodungen teaches a big part of it. Clint, he’s a cybersecurity guru. What he loves to do is sit in a lab and figure out how to hack.
William: Yeah, I know Clint real well and I agree. That’s his passion.
Russel: For years, people said, “There’s no way…We use Modbus. Nobody’s going to be able to hack Modbus.” He downloaded some stuff off the Internet at a conference.
Over the course of 30 minutes, he did a man-in-the-middle attack by capturing Modbus messages and spoofing a host — spoofing meaning tricking a host into thinking I’m real — and was able to overflow tanks while telling the host that everything’s good.
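The attack Russel describes is possible because a classic Modbus/TCP frame carries no authentication or integrity field at all. A sketch of building a “Write Single Register” frame by hand makes that visible; the register address and value here are arbitrary illustrative numbers:

```python
import struct

def modbus_write_single_register(transaction_id, unit_id, register, value):
    """Build a raw Modbus/TCP 'Write Single Register' (function 0x06) frame.

    Note the layout: MBAP header plus PDU, and nowhere in it is there a
    credential, signature, or checksum beyond TCP's own. Any host that can
    reach port 502 can issue writes, which is why spoofing a Modbus host
    is straightforward.
    """
    # PDU: function code (1 byte), register address (2 bytes), value (2 bytes)
    pdu = struct.pack(">BHH", 0x06, register, value)
    # MBAP header: transaction id, protocol id (0 = Modbus), length, unit id
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

frame = modbus_write_single_register(transaction_id=1, unit_id=1, register=10, value=0xFFFF)
print(frame.hex())  # 0001000000060106000affff -- 12 bytes, no auth anywhere
```

Anyone who can capture a frame like this can replay or alter it, which is the heart of the man-in-the-middle demonstration above.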
William: You know what’s funny about that?
Russel: I have no idea. [laughs]
William: I was listening. What’s funny about that is I have that all on video.
Russel: Yeah, I know he made a YouTube video. It’s pretty damn compelling…
William: Pretty funny.
Russel: …particularly for people that think that this hacking stuff is hard. One of the things is it’s not really.
Russel: I remember…Gosh, I wish I could find it. I’m going to have to look for this, see if I can link it up. There was a video of some guys sitting in an auto plant and the robots are going nuts.
They’re painting something on the side of the car and then it cuts to a little kid who’s on a laptop and watching this going on in a plant. Her mom says, “Suki, come to dinner,” and “Suki” is what’s being painted on the side of cars. [laughs]
William: Oh, yes. Yes.
Russel: The funny thing about this is that it’s not like that is impossible to do. Anyways, yeah, I think it’s interesting. The other thing about Clint and in this whole domain and to talk a little bit more about social engineering is one of the things that Clint used to do… He doesn’t like doing this kind of work, but it’s pretty lucrative and it’s one of the ways he kind of got anchored in the business. He did red team work.
I’m sure you know what that is. Our listeners might not. Red team is where you hire somebody to hack your systems and get control of your control systems. He was engaged by some large companies. If I were to put their names out, you would recognize those names.
I often ask people, and I’ll ask you William, how often do you think he was successful getting 100 percent access to a critical control system?
William: [laughs] 110 percent.
Russel: [laughs] Yeah. That’s exactly right. The answer is 100 percent, so the issue was never, could he? The issue was always, what did it take?
When it was hard, it always involved some kind of social engineering, either directly with people, where I managed to get into the control room when I shouldn’t be in the control room and shove a thumb drive into a workstation, or through some other kind of process.
When you combine creative social engineering with creative programming… cybersecurity is just like putting locks on the door of your house: it keeps the honest people out. If somebody is dishonest and they want in your house, they’re going to get in.
If you put locks on the door and do things that are appropriate then they’re probably not getting in. Kind of the same thing I think.
William: No, I agree with that. It’s interesting. A quick tangent on that: when I first started getting in, I was going, “Oh man, these flash drives,” I call them jump drives and flash drives, “are the cat’s meow, right?” Then about three months into it, one day I put an executable on there and ran it off the USB, and I didn’t think about it until I went, “Oh crap.”
William: Then I was like, “USBs are from Satan.”
William: Now it’s another attack vector and you’re like, “How do I save myself?”
Russel: Exactly, so I’m going to disable all the USB ports on all my computers so I don’t have to worry about thumb drives, right?
William: Yeah, and I know people that put hot glue in them. [laughs]
Russel: Yeah, I mean, on things that have access to critical control systems, a lot of times that makes sense, and yet people have to work on those… It’s like a lot of other things: when you know enough about it, it starts getting really complex and fairly difficult.
I think if more people read the intelligence of the threat, and what’s going on, and the kind of things that are occurring, there would be less resistance to complex passwords, and difficulty in accessing control systems, and all that kind of stuff. I know that all that creates obstacles for doing our daily work, but they’re important obstacles.
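The resistance to complex passwords that Russel mentions is easier to argue against with a little arithmetic. A rough back-of-the-envelope sketch, where the guessing rate of 10^10 attempts per second is an assumed figure for an offline attacker, not a measurement:

```python
def years_to_exhaust(alphabet_size, length, guesses_per_second=1e10):
    """Worst-case time to try every password of a given length and alphabet."""
    keyspace = alphabet_size ** length
    seconds = keyspace / guesses_per_second
    return seconds / (365 * 24 * 3600)

# 8 lowercase letters vs. 12 mixed-case letters and digits
weak = years_to_exhaust(26, 8)     # exhausted in seconds
strong = years_to_exhaust(62, 12)  # thousands of years at the same rate
print(f"{weak:.6f} years vs {strong:,.0f} years")
```

Adding length and alphabet size multiplies the keyspace exponentially, which is why length-plus-complexity rules survive even though they are inconvenient day to day.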
William: Yeah, absolutely. We always have a saying, or maybe I have the saying, and that is, cybersecurity really does need to be the KISS mentality, but more importantly, it’s always about availability. That’s where, on the opposite side of the wall — and I call it the opposite side of the wall — from our brothers in IT, our triangle is a little bit different.
We’re always keen to make sure that the system is available. It’s always a safety thing for us, and availability helps that a lot. Then we are concerned then about the integrity of the data. Then we’re concerned about the security.
With that we always have to think in the back of our minds, “As long as I’m not going to create an opportunity to take the system down, and the data is good, then I can do this work.” In our area it really kind of keeps things fairly straightforward. We don’t have a lot of jumps and things we have to worry about, but on the flip side of that it does have some exposure, it does have some risk.
If you’re looking at it from multiple attack vectors, if you’re looking at it across the scale, there is a solution for each of those pieces that can simplify your life.
Patching, things of that nature, well, yeah, you might have to do some things manually to get them over, but there are also areas in the world where we’re like, “Yeah, we need to do some automation work because we got so many machines and we know that we can do these things without affecting our availability.”
Russel: I haven’t heard it put that way. I think that’s actually really important, because one of the things unique about control systems with critical infrastructure like pipelines is the availability, because they’re absolutely necessary to protect the people, to protect the environment, to ensure safe and continual delivery of product. It’s a big deal.
These systems have to be up, they have to be available. If they’re not there can be pretty significant consequences. Before we leave this broader topic of cybersecurity I do want to ask you about one other thing. I think it’d be appropriate to talk a little bit about standards.
There are a ton of cybersecurity standards out there: ISA, there’s NERC CIP, you’ve got the NIST Framework, API 1164, and on, and on, and on. How do you approach that kind of standards soup, if you will, in order to figure out how to run a program that’s, like you said, keeping it simple?
William: Oh my gosh.
Russel: How much time do we have to talk about this, right?
William: Yeah, I know, right? That’s loaded. For everybody listening, one of my favorite things to do is I like to… I shoot these out all the time to people: ISA, ISO, IEEE, cryptographic protection, you’ve got API 1164, BSI 100 and 134, you’ve got CNSSI, you’ve got NSS. There’s NRC, there’s SP 800, there’s…
Russel: You’re scaring me because you know all of this right off the top of your head. You’re worrying me a little bit about what you do with your free time. [laughs]
William: Yeah, we won’t talk about that.
William: We’ll talk about it later. From a perspective on standards, I think that standards really are guidelines. What I mean by that is, you have to know when a standard is right for you and when it is not right for you. I think that everybody goes into this the same way, and that is, “Oh my gosh, it’s a standard. This is what we have to do.”
That’s not the case. Really, you need to have the opportunity to sit down with the standards and really go through them and say, “Yeah, you know what? This all makes sense.” I look at it and I say, really, standards are a set of guidelines that we use to create our own program, and underneath that program our policies, and underneath our policies are our standards.
From those standards we have procedures, and guidelines, and all the other documentation that we need. It’s all set up in that perspective. You need to take the opportunity and utilize all these various tools. They’re tools, they’re nothing more than tools.
Russel: Obviously, Enterprise is a large company and you guys have resources beyond what many of the companies that we work with do. We tend to work more with the smaller pipeline operators, smaller midstream guys, the guys who don’t have the resources for a staff that’s doing all this type of thing full time. It’s a real challenge to help them get their brains around it.
Again, I think you make a really good point, Will, that the mindset is, “Well, if we have a standard, we ought to follow it.” Cybersecurity is one of those areas where the standards are ubiquitous, maybe is the right word. [laughs] There is a lot of them, and you got to go through them, you got to rationalize them, and figure out how you’re going to apply them.
It becomes a trade-off between risk management, availability, and complexity. There is actually value to the industry that everybody does that a little bit differently. It actually makes the industry hardened to the nefarious actors, because if everybody does it the same way, then if I figure out one, I’ve figured out everybody.
William: There are two pieces, and I’ll bring them up real quick. That is, best practices aren’t standards. They’re also guidelines as well, of course, but they have more, I believe, meat in them than standards do. I know a lot of people would be like, “What? That makes no sense.” In reality they do; they really get in and talk more about the whys and hows, and I think that’s important.
A great example, SANS 20, those top 20 critical controls. I like that because that’s exactly what you’re saying. So here is my 20 categories, and I’m going to try to find technology that meets those 20 categories. I’m going to have a bunch of different types of technology from different vendors, and I’m going to achieve my… okay, I’m needing that, and I’m building my program off that.
I always like to say this: I’m a sandwich DMZ between control systems and the corporate world. This is just my philosophy; I’ve always thought that, and I feel like it’s a little bit safer. However, I also find it’s important to think and consider: well, if the two firewalls in my sandwich are both Cisco products, I didn’t make it any more difficult for a person to hack it.
Because the one person that knows Cisco is going to be able to get through both of those. Now, if I make one of those a Juniper product or something else, a Palo Alto, then I have effectively said, “Guess what? You only know Cisco, so you only made it that far. Now you’ve got to figure out the rest of the world, and by the time you do, I’m going to be able to throw you out.”
Russel: Now that’s it. Again, that’s really interesting to me. I understand what a sandwich DMZ is. Basically, a sandwich DMZ is an area that I want to keep secure with two paths in and out, the Internet maybe being one side and the corporate network being another, and I’m kind of in the middle of those two things.
I’ve also heard of defense in depth, where I take my network and firewall it and stack it so that the SCADA servers are firewalled from the radios, which are firewalled from the PLCs, that kind of thing. What would you call giving different kinds of networks different kinds of products? Do you cybersecurity dudes have a name for that as a strategy?
William: What I like to call it is, I like to call it, ain’t nobody know everything.
Russel: [laughs] Yeah. You’re using complexity as a tactic.
William: Yeah, differentiation of product. Absolutely. Someone might have the skills to go do one thing, but they don’t have the other.
Russel: Yeah, that again, that’s a really interesting observation. I want to kind of shift gears on you, if you will, and talk about something else that’s kind of a passion of both yours and mine, and that’s what you do with your church. Earlier in the week we were kind of visiting about doing this, about you having a vision for cybersecurity for churches. Why don’t you tell us a little bit about that and what you’re doing there?
William: Absolutely. Being involved with my faith, and my church, and the things that I do, it’s my second job, if you will. I’m always here at the church doing something technology-wise and helping out where I can.
I was coming over to the church just to make sure that everything was ready for Sunday, and I looked over at our announcement machine, which runs all the slides for the entire building, and noticed, “Hmm, that’s weird, it’s not running.” I pulled the application up, looked at it going, “This doesn’t look right.” Sure enough, I go look at the server, and the server had been ransomwared. It was my favorite, because I could look at it, and at the top of that directory, right on the main mount, was that file, clear as day, like, “Hey, we are protecting your files. Pay us and we’ll get rid of the bad guy.”
It was my favorite thing to ever see in my life. I got a copy of it and all sorts of stuff. But for a church to be hit like that, you think about it: churches don’t have the resources to handle this. Luckily, my church has me, and because I can’t be here all the time, we actually have a company that helps us out.
They’re really good at some of this stuff too. Working with them, it took almost two weeks to get everything back where we needed it to be. Because of that, out of that, I realized, “Wow. Something really needs to be available to churches and faith-based organizations.”
From that I’ve actually started a little something called White Hat Church. It’s whitehatchurch.com.
I’m getting ready to start a podcast and solution-based whitepapers that will help take Pastor Steve, who basically knows how to work his laptop, and give him three really simple, straightforward things to do to increase his security posture as a user of whatever it is he’s doing.
It’s going to scale up, not really becoming more technical, just scaling up on what I’d like to call the protection level. We start with training staff, to the point where I can move to the next level and say, “Okay, now that you understand your posture better, let’s talk about some additional fundamentals and how we can help.”
Right now I’m not focusing on money or anything like that. It’s solely to help give churches resources that they don’t have access to today, either because of financial reasons or just because they don’t know where to go.
I think that it’s important because I know a lot of churches are heavily invested in the cloud, and I know in the ICS world it’s, “Oh, the cloud. Oh no.” But the reality is, Facebook is the cloud. We use it for all sorts of things. Twitter, Instagram, SoundCloud. We can go on and on and on.
My church, we moved quite a few things over to Google Drive so that way we have access to some of those very critical things everywhere we’re at. We don’t have to have nearly the infrastructure.
But with that also comes the responsibility of understanding, “How do I protect the membership and the pastoral staff as well?” My goal with this is really to help train pastoral staff and tech staff and whomever else. You don’t even have to be on the staff of the church. You just really care about your church and you want to do this for them. Hopefully this will really grow into something.
Russel: I think that’s awesome, Will. I really do. I have connections and I’m fairly active at my church. When you get this a little further down the road and you’re looking for other people to take advantage of it, please let me know. I’d love the opportunity to help spread the word. I think that’s awesome.
We’re talking about cybersecurity. I try to be pretty hard as an individual, in terms of making it hard to get access to my stuff. I had an issue just earlier this week where somehow somebody got access to Facebook Messenger and started using my account to send messages.
I went through a process and I cleaned it up. I was lucky that some people told me it was going on or I wouldn’t have known, and was able to get that cleaned up pretty quickly.
But Software as a Service is a great idea for things like churches that don’t really want to invest in… not so much that they don’t have the infrastructure, but they don’t want to maintain the infrastructure. They’re moving to this Software as a Service model and becoming more reliant on Facebook and social media to execute their mission. That does create risk. I think that’s awesome.
Look, we’ve kind of come to the end of our time. I’m definitely going to ask you to come back. I know that there’s other subjects that we could talk about in terms of high performance HMI and alarm management and some other things that I think our listeners would be interested in. But I think we’ve probably buried them in enough buzzwords and jargon for one episode.
Russel: Look, again, thank you very much for being our guest, and we look forward to having you back. To all the listeners, thank you for listening. Just a reminder that you should go to the Pipeliners Podcast website for an opportunity to win a YETI tumbler with the Pipeliners Podcast logo emblazoned on the side.
Simply go to pipelinerspodcast.com/win. That’s pipelinerspodcast.com slash the word “win.” Enter yourself in the drawing. Thanks for listening and we look forward to having you here next time.
Announcer: Share your questions and comments with us at pipelinerspodcast.com. You can support the show by liking and following us on SoundCloud or by rating and reviewing the show on iTunes, Google Play, or Stitcher. Thanks for listening to the Pipeliners Podcast.
Transcription by CastingWords