
(blue dotted line) has a constant payoff that doesn't change with interest rates. The customized CDS contracts (red and dark blue dotted lines) have "knock-in" and "knock-out" clauses triggered by changes in interest rates. A credit event triggers CDS protection. Like a burglar alarm that goes off only when "armed," a knock-in clause (or knock-out clause) defines a second variable that controls whether the CDS's credit trigger is active (or inactive). The solid lines show the aggregate payout on a contract that has both knock-in and knock-out clauses. The purple solid line shows the accurate payouts with information about those clauses, and the blue solid line shows what those payouts would look like without that information. Clearly, the existence of knock-in/knock-out contingencies is critical information. Unless regulators have a standardized way to collect contract terms about such clauses, these differences may be ignored. Without information about knock-in or knock-out clauses, regulators might be led to believe that aggregate market exposures to a specific change in interest rates are benign (blue line), when in fact those exposures are highly volatile (red line).
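To make the distinction concrete, the sketch below (in Python, with barrier levels and notionals that are purely hypothetical rather than taken from the figure) aggregates a small book of CDS positions twice: once using the knock-in/knock-out terms and once treating every contract as plain protection. The second aggregate looks flat and benign, while the barrier-aware aggregate swings with the interest rate.

```python
# Hypothetical illustration: aggregate CDS payout with and without
# knock-in/knock-out information. Barrier levels and notionals are
# invented for this sketch, not drawn from the report's figure.

def cds_payout(notional, credit_event, rate=None,
               knock_in=None, knock_out=None):
    """Payout of a single CDS contract given a credit event.

    knock_in:  protection is active only if rate >= knock-in barrier.
    knock_out: protection is void if rate >= knock-out barrier.
    A contract with neither clause pays whenever a credit event occurs.
    """
    if not credit_event:
        return 0.0
    if knock_in is not None and (rate is None or rate < knock_in):
        return 0.0   # credit trigger never "armed"
    if knock_out is not None and rate is not None and rate >= knock_out:
        return 0.0   # credit trigger disarmed
    return notional

# Two offsetting positions: protection bought with a knock-in clause,
# protection sold with a knock-out clause (hypothetical terms).
book = [
    {"notional": +100, "knock_in": 0.03, "knock_out": None},
    {"notional": -100, "knock_in": None, "knock_out": 0.03},
]

for rate in (0.01, 0.02, 0.03, 0.04):
    with_terms = sum(
        cds_payout(c["notional"], True, rate, c["knock_in"], c["knock_out"])
        for c in book)
    without_terms = sum(
        cds_payout(c["notional"], True)   # clauses unknown, assumed plain
        for c in book)
    print(f"rate={rate:.2f}  with clause data: {with_terms:+.0f}  "
          f"ignoring clauses: {without_terms:+.0f}")
```

Ignoring the clauses, the two positions appear to net to zero at every rate; with the clause terms, the aggregate exposure flips from -100 to +100 as the rate crosses the barrier.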

Scalability Challenges

The abundance of data highlights the need for strong data management. Data management systems and practices are increasingly mismatched to the scale of the four Vs of "big data": volume, velocity, variety, and veracity. Regulators risk being overwhelmed by the increasing volume, arrival rate (velocity), and variety of data. New data collections will introduce new challenges of data quality (veracity) (see Flood, Jagadish, and Raschid, 2016). New approaches to data management are needed to meet these challenges.

Data volumes have grown exponentially in recent decades (see Figure 72). Legacy processes cannot simply scale up to collect, clean, integrate, analyze, and share information by using bigger storage and faster processors. Rather, supervisors and firms need new processes to address the challenges of big data and to fully leverage the information big data can provide. Data processes designed for firm-level supervision will face scalability challenges as they are stretched to monitor the system as a whole.

One example is the need to accurately identify entities across different sources and data systems to allow supervisors to assemble a picture of the overall system. However, financial utilities, firms, and regulators use proprietary identification schemes, requiring processes to align entity data across these systems. This situation is a big-data problem because, without a coordination mechanism, the number of alignments to manage grows much faster than the number of identifier sets involved. Rather than continually scaling legacy processes to link different identification schemes, a different approach, such as the LEI, would better address the scalability challenge. The LEI defines each legal entity only once, facilitating data-quality management by eliminating different methods of referring to the same entity.
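A back-of-the-envelope illustration of that scaling claim: reconciling n proprietary schemes pairwise requires on the order of n(n-1)/2 crosswalks, whereas mapping each scheme once to a common identifier such as the LEI requires only n mappings. The counts of 5, 20, and 100 schemes below are hypothetical.

```python
# Number of mappings needed to reconcile n proprietary identifier schemes.
# Pairwise crosswalks grow quadratically; mapping every scheme once to a
# common identifier (e.g., the LEI) grows only linearly.

def pairwise_crosswalks(n):
    return n * (n - 1) // 2   # one crosswalk per pair of schemes

def mappings_to_common_id(n):
    return n                  # one mapping per scheme to the common identifier

for n in (5, 20, 100):
    print(f"{n:>3} schemes: {pairwise_crosswalks(n):>5} pairwise crosswalks "
          f"vs {mappings_to_common_id(n):>3} mappings to a common identifier")
```

With 100 identifier sets, pairwise alignment implies 4,950 crosswalks to maintain, versus 100 mappings to a single common identifier.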

