
Early-Career IT Research in Austria

The OCG Journal is the membership magazine of the Österreichische Computer Gesellschaft (OCG). The first OCG Journal of the year is once again dedicated to early-career IT research in Austria. Selected young researchers present their exciting work in the field of computer science.

Anastasia Pustozerova is a Machine Learning Researcher at SBA Research and the University of Vienna. Her research interests are Privacy-Preserving Machine Learning, Federated Learning, and Differential Privacy.

Distributed Computing and Dynamic Graph Algorithms

by Tijn de Vos

Matrix-Vector Multiplication in Distributed Models

Part of theoretical computer science is to develop new algorithms with provable guarantees on their running time. One substantial subfield consists of graph algorithms. Graphs model many real-world networks, such as road networks or social networks. The goal is to efficiently compute properties of graphs, such as finding the distance between two points in a transportation network (the shortest path problem) or finding well-connected nodes in a social network (community detection).
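To make the shortest path problem concrete, the following Python sketch runs Dijkstra's algorithm on a tiny, made-up road network; the graph, weights, and function name are illustrative assumptions and do not come from the article.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths for non-negative edge weights.

    graph: dict mapping each node to a list of (neighbor, weight) pairs.
    Returns a dict with the shortest distance from source to every reachable node.
    """
    dist = {source: 0}
    queue = [(0, source)]               # (distance found so far, node)
    while queue:
        d, u = heapq.heappop(queue)
        if d > dist.get(u, float("inf")):
            continue                    # stale queue entry, skip it
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(queue, (dist[v], v))
    return dist

# Hypothetical road network: nodes are junctions, weights are travel times.
roads = {
    "A": [("B", 4), ("C", 1)],
    "C": [("B", 2), ("D", 6)],
    "B": [("D", 1)],
    "D": [],
}
print(dijkstra(roads, "A"))             # {'A': 0, 'B': 3, 'C': 1, 'D': 4}
```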

CONTINUOUS OPTIMIZATION<br />

Graphs are discrete objects, and graph problems are often solved with combinatorial, discrete algorithms. In another part of computer science, we have continuous optimization, a branch of optimization where the variables can take continuous values. This is opposed to discrete optimization, where they can only take discrete values. Surprisingly, in 2004 Spielman and Teng showed in their seminal work, which later earned them the Gödel Prize, that discrete graph problems can benefit from continuous optimization techniques. Examples of discrete problems that benefit from continuous optimization techniques include computing shortest paths or computing the maximum flow. In the latter, the goal is to send the maximum amount of flow from a source to a target through the network. Such algorithms have been developed in the normal, ‘centralized’ model of computation over the last 20 years. Recently, this line of work led to the celebrated near-linear-time algorithm for maximum flow.
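As an illustration of how a discrete graph problem turns into a continuous one, maximum flow can be written as a linear program over real-valued edge flows (one variable per edge, bounded by the edge capacity); this is the standard textbook formulation, added here for context rather than taken from the article:

```latex
\[
\max \;\; \sum_{e\ \text{out of}\ s} f_e \;-\; \sum_{e\ \text{into}\ s} f_e
\qquad \text{subject to} \qquad
0 \le f_e \le c_e \ \ \text{for every edge } e,
\qquad
\sum_{e\ \text{into}\ v} f_e \;=\; \sum_{e\ \text{out of}\ v} f_e \ \ \text{for every node } v \notin \{s, t\}.
\]
```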

A first example of such optimization techniques is solving a system of linear equations. When these equations relate to the structure of a graph, we call it a Laplacian system. With continuous optimization, the goal is to solve this (much) faster than just trying possible values. A more involved example is solving linear programs: optimizing an objective function under a set of linear constraints. This generalizes problems like shortest paths, maximum matching, and maximum flow.
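In standard notation (added here for illustration, not taken from the article): the Laplacian of a graph is its degree matrix minus its adjacency matrix, a Laplacian system asks for a vector x with Lx = b, and a linear program optimizes a linear objective under linear constraints:

```latex
% Laplacian system: the graph Laplacian is the degree matrix D minus the adjacency matrix A.
\[
L = D - A, \qquad \text{solve } L x = b \text{ for } x.
\]
% General form of a linear program: linear objective, linear constraints (constraint matrix M).
\[
\min_{x}\; c^{\top} x \qquad \text{subject to} \qquad M x \ge b, \quad x \ge 0.
\]
```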

MATRIX-VECTOR MULTIPLICATION

It turns out that one of the most important primitives for solving optimization problems is matrix-vector multiplication. Where matrix multiplication is an (in)fa-
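Since the article's title concerns matrix-vector multiplication in distributed models, here is a minimal Python sketch of one common distribution scheme, assuming the matrix is partitioned by rows across machines and each machine knows the full vector; the scheme and names are illustrative assumptions, not the specific model analysed in the article.

```python
def local_matvec(rows, x):
    """One machine multiplies its block of rows with the shared vector x."""
    return [sum(a * xj for a, xj in zip(row, x)) for row in rows]

def distributed_matvec(matrix, x, num_machines):
    """Row-partitioned matrix-vector product.

    The matrix is split into contiguous row blocks; each block stands in for
    one machine's local data.  Concatenating the local results gives the full
    product (in a real system this final step is a communication round).
    """
    n = len(matrix)
    block = (n + num_machines - 1) // num_machines   # rows per machine (rounded up)
    result = []
    for m in range(num_machines):
        rows = matrix[m * block:(m + 1) * block]
        result.extend(local_matvec(rows, x))         # gather the partial results
    return result

A = [[2, 0, 1],
     [0, 3, 0],
     [1, 1, 1]]
x = [1, 2, 3]
print(distributed_matvec(A, x, num_machines=2))      # [5, 6, 6]
```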
