Libro de Resúmenes /Proceedings - Sistema Universitario Ana G ...


at least one loop finite path, then there are infinitely many finite paths. In this case, it is hard to give a finite (discrete) measure on FP(G). To avoid this case, we define the diagram measure ∆ on FP(G). We also give the degree measure d on V(G). Then, on G, we can define the bounded measure µ = d ∪ ∆. We will observe the measure theory on the measurable space (G, µ). Similarly, we will also consider the shadowed graph measure space (Ĝ, µ_Ĝ), where Ĝ = G ∪ G⁻¹ is the shadowed graph of G. We observe the graph integrals of graph-measurable functions with respect to the shadowed graph measure µ_Ĝ. Such graph measuring is an invariant on finite directed graphs. Finally, we briefly introduce the graph von Neumann algebra M_G.
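Two of the objects in the abstract above are elementary enough to illustrate concretely: the degree measure d on V(G) and the shadowed graph Ĝ = G ∪ G⁻¹. The sketch below is our own illustration under the assumption that a finite directed graph is given as a set of vertex pairs; the representation and function names are not the author's notation.

```python
def shadowed_graph(edges):
    """G ∪ G^{-1}: the edge set together with every reversed edge."""
    return set(edges) | {(w, v) for (v, w) in edges}

def degree_measure(vertices, edges):
    """d(v) = in-degree plus out-degree of v, a measure on V(G)."""
    return {v: sum(1 for (a, _) in edges if a == v) +
               sum(1 for (_, b) in edges if b == v)
            for v in vertices}

# A 3-cycle: every vertex has degree 2, and Ĝ doubles the edge count.
V = {"v1", "v2", "v3"}
E = {("v1", "v2"), ("v2", "v3"), ("v3", "v1")}
print(degree_measure(V, E))    # each vertex has degree 2
print(len(shadowed_graph(E)))  # 6 edges in Ĝ
```

The diagram measure ∆ on the finite-path space FP(G) is more delicate and is not attempted here.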

Using Rough Sets theory in KDD methods

Frida R. Coaquira Nina 13 and Edgar Acuña, University of Puerto Rico at Mayagüez.

Rough Sets Theory was introduced by Z. Pawlak (1982) as a mathematical tool for data analysis. Rough sets have many applications in the field of Knowledge Discovery, among them feature selection, the discovery of decision rules, data reduction, and the discretization of continuous features.

The theory can be used when the dataset has irrelevant (dispensable) features that can be eliminated, reducing in this way the dimension of the problem and finding subsets of relevant (indispensable) features. By combining Rough Set Theory with the usual feature selection methods, we obtain an algorithm similar to a wrapper method for feature selection. The principal idea is to recognize the dispensable and indispensable features using the discernibility relation across the dataset.
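The dispensable/indispensable distinction the abstract relies on can be shown in a few lines. The sketch below uses the simplest, decision-free notion from rough set theory: an attribute is dispensable if dropping it leaves the indiscernibility partition unchanged. The data and names are illustrative assumptions, not the authors' algorithm.

```python
def partition(rows, attrs):
    """Blocks of the indiscernibility relation IND(attrs): two rows
    fall in the same block iff they agree on every attribute in attrs."""
    blocks = {}
    for i, row in enumerate(rows):
        blocks.setdefault(tuple(row[a] for a in attrs), set()).add(i)
    return {frozenset(b) for b in blocks.values()}

def dispensable(rows, attrs, a):
    """a is dispensable in attrs if removing it does not change
    the indiscernibility partition of the dataset."""
    rest = [b for b in attrs if b != a]
    return partition(rows, rest) == partition(rows, attrs)

# Feature "c" duplicates "a", so it is dispensable; "b" is not.
rows = [
    {"a": 0, "b": 0, "c": 0},
    {"a": 0, "b": 1, "c": 0},
    {"a": 1, "b": 0, "c": 1},
]
print(dispensable(rows, ["a", "b", "c"], "c"))  # True
print(dispensable(rows, ["a", "b", "c"], "b"))  # False
```

A wrapper-style method, as described above, would combine such a test with a classifier's accuracy to decide which candidate subsets to keep.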

Versions of the probability centrifuge algorithm

Dennis G. Collins, Department of Mathematics, University of Puerto Rico at Mayagüez.

Different versions of the author's probability centrifuge algorithm are presented with examples. These algorithms are based on the possibility of moving probability amplitude around without affecting entropy. A radial algorithm based on one-dimensionalized entropy is presented, and dependence on boundary conditions is discussed.
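The premise of the abstract above, that probability can be moved around without changing entropy, has a simple discrete instance: Shannon entropy is invariant under any rearrangement of the probability masses. The snippet below illustrates only that invariance; it is our own example, not the author's centrifuge algorithm.

```python
import math

def shannon_entropy(p):
    """H(p) = -sum p_i log p_i (natural log), skipping zero masses."""
    return -sum(x * math.log(x) for x in p if x > 0)

p = [0.5, 0.25, 0.125, 0.125]
q = [0.125, 0.5, 0.125, 0.25]   # same masses, moved to new positions
print(abs(shannon_entropy(p) - shannon_entropy(q)) < 1e-12)  # True
```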

13 frida cn@math.uprm.edu

