
Inside NIRMA - Spring 2018 Issue


A Retrospective on Information Management in Nuclear Power
By Eugene Y. Yang, Principal Consultant, KISMET Consulting, Inc.

This column takes a look back on information management (data, documents, and records) in the nuclear power industry. I have been fortunate to either be employed by or consult to many of the utilities and power plants in the U.S., seeing where things were and how they evolved over the past 35-plus years. The plan is to make this a regular column in the Inside NIRMA magazine.

You know the story that you tell your children about how tough you had it growing up ("I had to walk five miles to school every day, in the snow, winds howling, wind chill in the minus 20s… even in the spring… and it was uphill there and back!")? You think you have it rough today? Scanning documents at 100 ppm, processing "born-digital" documents and records, uploading them into an electronic repository, so that you can view them in less than five seconds in a web browser, smartphone, or tablet ("Five seconds! Man, that's SLOW."). And then there are the times when the technology folks sound like a John Wayne movie ("We have to figure out how to build a cyber defense so Apache Tomcats don't take our Red Hat!").

Well, back in the day, at the start of my career, all this processing stuff was paper-based. I found myself helping the plant folks navigate the nascent use of computers in data, document, and records management ("…in howling winds, freezing temperatures." Hmmm. Actually, for plants under construction, it was the truth!).

My first job in the nuclear power industry, in 1983, was with a southern-based utility. Computerization occurred with mainframes and minicomputers, accessed through monochrome or color terminals. Even then, however, there was a need to envision integrated systems, linking plant control systems, management systems, and administrative systems. I had the opportunity to cut my teeth on "information systems architecture" – conceptualizing business and system architectures that sought to provide the path forward from current implementations to holistic integration. Back then, it was a fundamentally data-driven exercise; paper-based records were stored on shelves, in banker boxes, file cabinets, desks, floors, etc.

At that time, there were three stations: one in construction, another in startup, and the third one in operation. For a young engineer in IT, it was great to be able to get into the battles of mainframe vs. mini, integration vs. standalone, and plant vs. plant.

Three Mile Island caused the industry to address the need to have accurate record indexes, available to a wide audience, accessible in near real-time, and redundantly stored. One of the interesting reactions to these requirements was the use of Tandem NonStop systems: fault-tolerant computers used for ATM networks, banks, stock exchanges, and other commercial transaction processing applications requiring maximum uptime and zero data loss. The thinking was that Tandem computers provided that redundant, "we're always going to be up" capability that would allow access to records in case of another TMI incident.

Did you know there used to be word-processing pools? Word-processing equipment (think Wang; IBM Displaywriter) was so expensive, you could only justify having it by centralizing the resources. We would type up our drafts in the mainframe-based terminal text editor, print them out, then hand the printout to the word-processing staff to type it.
Then we would "stet" or otherwise redline in a vicious cycle to get the final document. (Some luddites in our office wrote their stuff out WITH A PEN, and then handed it to word-processing. Hah. I was "modern.")

But then came the emergence of the microcomputer. I was an early adopter of the Apple II+ and educated my way through spreadsheets using VisiCalc. At the office, we got our first IBM PC, shared among our section of 16 people. It had 256K of RAM, a 5¼-inch floppy drive, and a whopping 10 Mbytes of awesome hard drive. Pretty much processed words and spreadsheeted budgets with that puppy. Later, at another position, I actually had in my desk my own Iomega 10-Mbyte cartridge disk (think Banquet fried chicken dinner packaging… hmm, hungry…). I was being "efficient" by not clogging up the drive on the PC.

Eugene has been a member of NIRMA for over 32 years. At the time he joined, NIRMA had only been in existence for 10 years. He would love to hear about the early days from others, so please email stories and anecdotes to him at eugene.yang@kismetconsulting.com.

SavantX Delivers on the Nuclear Promise
By Ed Heinbockel, President/CEO, SavantX, Inc.

Artificial Intelligence (A.I.) and Machine Learning (ML) are increasingly transforming business intelligence. The development of the SavantX Platform and its advanced analytics spans a number of years and two successful pilots at Diablo Canyon Nuclear Power Plant. The Platform was optimized on station data for the volume and variety of unstructured nuclear data, with worker input and guidance as they used the tool to solve outage-related challenges.

"If we had used the old way, it would have taken us days or weeks, and we would not have seen hidden trends."
Diablo Canyon Nuclear Power Plant Senior Engineer

Station data coupled with SavantX A.I. helps Deliver on The Nuclear Promise by making all data easily retrievable, revealing relationships that point to safety and efficiency improvements, and significantly improving processes rather than just automating them.

Swimming in a Sea of Data

SavantX's lineage includes open source intelligence gathering tools for the Defense Intelligence Agency. So the base technology was an obvious fit to help nuclear power stations, swimming in unstructured data locked away in many structured databases, find what they need to operate safely, efficiently, and effectively. This operational experience demanded the development of new and novel technologies. For instance, the platform had to be self-healing and autonomous in its learning, adapting to new data without dedicated personnel keeping the platform operational and current with the latest data.

The SavantX platform offers state-of-the-art data visualization as well as a new take on ML for highly scalable A.I. apps. SavantX is focused on solving real-world problems through smart A.I. and autonomous, higher-dimensional ML.

See SavantX on page 15.