
WWW/Internet - Portal do Software Público Brasileiro

ISBN: 978-972-8939-25-0 © 2010 IADIS

2. The hotlink assignment procedure is repeated until all the desired changes are completed.
3. From the 'Hotlinks Home' page the web master automatically inserts the approved suggested hotlinks. The user also has the option to specify hotlinks for visualization and comparison only, without adding them. This step completes the transformation of the site.
4. The user can now duplicate the stored instance of the site, either automatically or by repeating step 1 (crawling the altered site and acquiring the site's map with the hotlink additions).
5. Finally, the web master can activate the first crawl and delete the hotlinks added in step 2.

The above procedure leaves the user with both the new and the old version of the web site; of course, only the latest version exists live on the server. By following the same steps, the web master can enrich the set of site versions stored by 'Hotlink Visualizer', thus building an archive of site versions which he or she can edit, compare and apply at any time.

6. CONCLUSION AND FUTURE WORK

In this paper we approached the hotlink assignment problem from a more practical perspective. We proposed a new administrative tool that aims to help the user exploit this field of research which, important as it is, remains relatively unapplied.

The 'Hotlink Visualizer' leaves numerous options open for expansion and future research. First of all, as a tool oriented towards the World Wide Web, its specifications must constantly evolve in order to avoid limitations in its use. An interesting prospect for future expansion is crawling the hidden parts of a site, or the deep Web, as discussed in [8]. Embedding invisible-crawl functionality in a web administration tool such as the 'Hotlink Visualizer' would massively broaden its scope and generality of use.
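As a rough illustration of the versioning workflow in steps 1-5 above, the following sketch models a crawled site map as an adjacency dictionary, applies approved hotlinks as extra edges, and keeps the snapshots the tool would archive. All class, method and page names here are hypothetical, chosen for illustration; they are not the actual interface of 'Hotlink Visualizer'.

```python
from copy import deepcopy

class SiteArchive:
    """Minimal sketch of the crawl-snapshot / hotlink-enrichment cycle.
    Names are illustrative, not the tool's real API."""

    def __init__(self):
        self.versions = []  # list of (label, site_map) snapshots

    def snapshot(self, label, site_map):
        # Steps 1 and 4: store an instance of the site map acquired by a crawl.
        self.versions.append((label, deepcopy(site_map)))

    def add_hotlinks(self, site_map, hotlinks):
        # Steps 2-3: insert approved hotlinks as (source -> target) shortcut edges.
        new_map = deepcopy(site_map)
        for src, dst in hotlinks:
            new_map.setdefault(src, []).append(dst)
        return new_map

    def revert(self, label):
        # Step 5: retrieve an earlier stored version to reactivate it.
        for name, site_map in self.versions:
            if name == label:
                return deepcopy(site_map)
        raise KeyError(label)

# Usage: archive the original crawl, then a hotlink-enriched version.
archive = SiteArchive()
original = {"/": ["/products", "/about"], "/products": ["/products/item42"]}
archive.snapshot("v1-original", original)
enriched = archive.add_hotlinks(original, [("/", "/products/item42")])
archive.snapshot("v2-hotlinks", enriched)
assert "/products/item42" in enriched["/"]       # hotlink shortcut added
assert archive.revert("v1-original") == original  # old version still recoverable
```

Keeping every snapshot immutable (via deep copies) is what makes the edit/compare/apply cycle described above safe: no enrichment step can silently mutate an archived version.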
Strictly from a usability point of view, there could be additions, statistical enhancements and optimizations, depending on the use and acceptance of the tool and, of course, on user feedback. There are already ideas for a more unambiguous grouping of each crawl's hotlinks. An even more exciting prospect, though, is to develop a library of hotlink assignment algorithms, specified so as to be applicable to the model of the web site instances acquired by our crawler. Research in this direction would require considerable effort in embedding other hotlink assignment methods in such a library, but it would provide the community with a complete hotlink assignment tool that could serve as a frame of reference for future web site optimization research.

REFERENCES

1. John Garofalakis, Panagiotis Kappos and Dimitris Mourloukos, 1999. Web Site Optimization Using Page Popularity. IEEE Internet Computing, July-August 1999.
2. John Garofalakis, Panagiotis Kappos and Christos Makris, 2002. Improving the performance of Web access by bridging global ranking with local page popularity metrics. Internet Research, volume 12, part 1, pp. 43-54.
3. John Garofalakis, Theodoula Giannakoudi, Evangelos Sakkopoulos, 2007. An Integrated Technique for Web Site Usage Semantic Analysis: The ORGAN System. Journal of Web Engineering (JWE), Rinton Press, Vol. 6, No. 3, pp. 261-280.
4. Fuhrmann S., Krumke S.O., Wirth H.C., 2001. Multiple hotlink assignment. Proceedings of the Twenty-Seventh International Workshop on Graph-Theoretic Concepts in Computer Science.
5. Miguel Vargas Martin, 2002. Enhancing Hyperlink Structure for Improving Web Performance. PhD thesis, Carleton University.
6. The WebSPHINX crawler main site: http://www.cs.cmu.edu/~rcm/websphinx. Carnegie Mellon University.
7. Junghoo Cho and Hector Garcia-Molina, 2002. Parallel crawlers.
In Proceedings of the eleventh international conference on World Wide Web, Honolulu, Hawaii, USA, pp. 124-135. ACM Press.
8. He, Bin et al., 2007. Accessing the Deep Web: A Survey. Communications of the ACM (CACM) 50, pp. 94-101.
9. D. Antoniou et al., 2009. Context-similarity based hotlinks assignment: Model, metrics and algorithm. In press, Data & Knowledge Engineering.
