The Asian Conference on Media and Mass Communication 2012
Official Conference Proceedings
Osaka, Japan

Refined Heuristic Principles for Instructional Media Sharing Platform Evaluation

Yu-hao Wu, Tay-sheng Jeng, Yi-shin Deng (0195)
National Cheng Kung University, Taiwan

Abstract

This research aims to establish new heuristic principles for evaluating instructional media on internet sharing platforms. Beyond Jakob Nielsen's ten original heuristic evaluation principles, this research focuses on developing and refining new heuristic evaluation principles for instructional multimedia.

The research takes as examples three video sharing platforms with similar functions but different styles. Users are asked to complete a set of tasks in a usability test, and heuristic evaluation is used to evaluate the same platforms. From the existing literature on user experience and heuristic evaluation, a number of recurring principles are listed; these new principles are added to Jakob Nielsen's ten original principles to evaluate the instructional media sharing platforms. Experts from the user experience domain are asked to complete a heuristic evaluation form in order to identify designs that work against usability. The results from the heuristic evaluation must match the results from the usability test; otherwise, the new principles are reconsidered and revised until they do.

During the usability test, users are encouraged to voice their thoughts and observations about what they are doing and about their experience with the three different platforms. The test concludes with questions about general feelings and observations on the whole exercise. The results from the usability test are listed systematically and treated as the reference standard.

Keywords: heuristic, evaluation, usability, instruction, video, media, sharing, platform

iafor
The International Academic Forum
www.iafor.org
CHAPTER 1 Introduction

1.1 Research background

TED (Technology, Entertainment and Design) is a global set of conferences owned by the private non-profit Sapling Foundation, formed to disseminate "ideas worth spreading."

"TEDTalks began as a simple attempt to share what happens at TED with the world. Under the moniker 'ideas worth spreading,' talks were released online. They rapidly attracted a global audience in the millions. Indeed, the reaction was so enthusiastic that the entire TED website has been reengineered around TEDTalks, with the goal of giving everyone on-demand access to the world's most inspiring voices." (TED website)

Taking TEDTalks as a model, we can see that ideas spread quickly and change our lives immediately. TEDTalks provides a platform for people to share their ideas not only in a physical lecture hall but also on the internet, and the videos simultaneously educate the people using the platform. Through TED, this kind of online social media has become a central venue for innovation and creativity. This research takes TEDTalks as its example for bringing innovative and creative-thinking video onto a prototype website, evaluated through basic heuristics.

1.2 Research objectives and aims

This research focuses on developing and refining new heuristic evaluation principles for educational multimedia. Taking Jakob Nielsen's original ten heuristics (developed for evaluating software in general) as a starting point, the research refines new principles for instructional media sharing platforms on the internet.

1.3 Scope and limitations

Heuristic evaluation is a methodology for investigating the usability of software originally developed by Jakob Nielsen (1993, 2000), a widely acknowledged usability expert. According to Nielsen (1994), heuristic evaluation "involves having a small set of evaluators examine the interface and judge its compliance with recognized usability principles (the 'heuristics')."

In this study, Nielsen's protocol was modified and refined for evaluating e-learning programs. Because the research concerns the usability of platforms for instructional sites, the design of the platform itself is treated as both an issue and a limitation in this research. For that reason, heuristic evaluation will be involved
into the design process while the platform is being designed.

The selection of participants is another limitation of this research. Both positive and negative comments were received for each of the three sets of heuristics (interface, educational design and content), and in some instances the evaluators offered constructive suggestions for improvement. Examination of the responses revealed that many of the problems had been noted by only one or two of the evaluators. The identified problems were used to develop a list for further investigation and remediation in the final version of the materials. (Albion, 1999)

CHAPTER 2 Literature Review

2.1 Internet Video Sharing

Online video existed long before YouTube popped up on the internet, but in that period it was very inconvenient for users to upload, manage, share and watch videos, for lack of an easy-to-use integrated platform. More importantly, videos distributed by traditional media servers and peer-to-peer file downloads such as BitTorrent were standalone units of content. No related video clips were connected to each single video, for example other episodes of a show that the user had just watched. There was also very little in the way of content reviews or ratings.

YouTube and its competitors have since overcome these problems. They let content suppliers upload their videos effortlessly, automatically converting from many different formats, and tag keywords for the uploaded videos. Users can easily share videos by mailing links to them, or by embedding them on web pages or in blogs. They can also rate and comment on videos, bringing a new social aspect to the viewing of videos. Thus, popular videos can rise to the top.

2.1.1 The Social Network in Video Sharing Websites - YouTube

The social network within YouTube further enables communities and groups. Videos are no longer independent from each other, and neither are users. This has substantially contributed to the success of YouTube and similar sites.

YouTube is one of the fastest-growing websites and has become the fourth most accessed site on the Internet. It has become the most successful Internet site providing a new generation of short video sharing service. YouTube has a significant impact on Internet traffic distribution, and itself suffers from severe scalability constraints. Today, YouTube alone comprises approximately 20% of all HTTP traffic,
or nearly 10% of all traffic on the Internet (Cheng, Dale, & Liu, 2007). Understanding the features of YouTube and similar video sharing sites is crucial to network traffic engineering and to the sustainable development of this new generation of service.

2.2 E-learning

E-learning can be defined as instruction delivered on a computer by way of CD-ROM, Internet, or intranet with the following features:
• Includes content relevant to the learning objective
• Uses instructional methods such as examples and practice to help learning
• Uses media elements such as words and pictures to deliver the content and methods
• May be instructor-led (synchronous e-learning) or designed for self-paced individual study (asynchronous e-learning)
• Builds new knowledge and skills linked to individual learning goals or to improved organizational performance

As this definition shows, e-learning has several elements concerning the what, how, and why of instruction.

What. E-learning courses include both content (that is, information) and instructional methods (that is, techniques) that help people learn the content.

How. E-learning courses are delivered via computer using words in the form of spoken or printed text and pictures such as illustrations, photos, animation, or video. Some forms of e-learning (asynchronous) are designed for individual self-study. Newer formats called virtual classrooms, or synchronous e-learning, are designed for real-time instructor-led training. Both formats may support asynchronous collaboration with others through tools such as wikis, discussion boards, and email.

Why. E-learning courses are intended to help learners reach personal learning objectives or perform their jobs in ways that improve the bottom-line goals of the organization.

In short, the "e" in e-learning refers to the "how": the course is digitized so it can be stored in electronic form. The "learning" in e-learning refers to the "what": the course includes content and ways to help people learn it; and the "why" refers to the purpose: to help individuals achieve educational goals or to help organizations build skills related to improved job performance. (Clark & Mayer, 2008)

This definition brings out that the goal of e-learning is to build transferable knowledge and skills linked to organizational performance, or to help individuals achieve personal learning goals. Although the guidelines presented throughout the book also apply to lessons designed for educational or general interest
learning goals, the emphasis is on instructional programs that are built or purchased for workforce learning.

2.3 Heuristic Evaluation

Heuristic Evaluation (HE) is a usability engineering method for finding the usability problems in a user interface design so that they can be attended to as part of an iterative design process. Heuristic evaluation involves having a small set of evaluators examine the interface and judge its compliance with recognized usability principles. It is used to identify interaction problems with computerized applications at the early assessment stage.

2.3.1 Interface design heuristics (after Nielsen) (Albion, 1999)

Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

Match between system and the real world: The system should speak the users' language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place.

Recognition rather than recall: Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user to such an extent that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

Help users recognize, diagnose, and recover from errors: Error messages
should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
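For illustration only (this is not part of the original study), the ten heuristics above can be carried as a small checklist structure so that each evaluator's findings can be tallied per principle; the short identifiers and the tally function are invented for this sketch.

```python
from collections import Counter

# Nielsen's ten heuristics (as listed above), keyed by short identifiers
# of our own choosing.
NIELSEN_HEURISTICS = {
    "visibility": "Visibility of system status",
    "match": "Match between system and the real world",
    "control": "User control and freedom",
    "consistency": "Consistency and standards",
    "error_prevention": "Error prevention",
    "recognition": "Recognition rather than recall",
    "flexibility": "Flexibility and efficiency of use",
    "aesthetic": "Aesthetic and minimalist design",
    "recovery": "Help users recognize, diagnose, and recover from errors",
    "help": "Help and documentation",
}

def tally_findings(findings):
    """Count how many usability problems each heuristic accounts for.

    `findings` is a list of (heuristic_key, problem_description) pairs,
    such as might be collected from the evaluators' forms.
    """
    counts = Counter(key for key, _ in findings if key in NIELSEN_HEURISTICS)
    return {NIELSEN_HEURISTICS[k]: n for k, n in counts.items()}

# Hypothetical findings, phrased after the kinds of problems reported later
# in this paper.
findings = [
    ("recognition", "Submenu is hard to locate on the course page"),
    ("consistency", "College categories differ between schools"),
    ("recognition", "No link on course topic text"),
]
print(tally_findings(findings))
# {'Recognition rather than recall': 2, 'Consistency and standards': 1}
```

Tallying findings per heuristic makes it easy to see which principles a platform violates most often, which is how aggregated heuristic evaluation results are typically reported.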
CHAPTER 3 Research Method

3.1 Framework

This research relies heavily on heuristic evaluation in order to refine new principles for instructional video sharing platforms on the internet. Heuristic methods have been shown to be cost-effective in the area of user interface evaluation (Albion, 1999).

What is this research? Why does it deserve study? How does it operate? In short, this research tries to find more appropriate principles for evaluating the interface of an instructional media sharing (e-learning) platform. The existing method for evaluating interfaces, launched by Jakob Nielsen, was developed for evaluating software in general; this research refines new principles specifically for instructional media sharing.

Nowadays, in education, teachers like to share digital information (such as videos, photos, and curriculum slides) on the internet so that students can have more discussion online. However, interaction between teachers and students has not been common in past experience. This research assumes that the interface design of the platform (e.g., NCKU's Moodle) is the primary reason for that. The aim is to develop new evaluation principles for instructional media sharing interfaces in order to improve usability between the platform and its users (teachers and students). This could then increase online interaction between teachers and students, or even casual browsers.
3.2 Research Process

Taking the above into consideration, this research proceeds as follows:

[Figure: research process flow diagram]
3.3 Methods

The experiment is based on usability testing and heuristic evaluation. Once the results from usability testing match the results from heuristic evaluation, they are taken into consideration for the final principles. Three different websites (NCTU OCW, TED and MIT) are taken as examples for the participants' experiments. In order to find the appropriate considerations for the principles, the experiments are run several times, adjusting the questions each time to reach the best set.

CHAPTER 4 Usability Testing Experiment

4.1 Introduction

The original intention of OpenCourseWare (OCW) is to unlock and share knowledge. The Chinese translation of OpenCourseWare, "開放式課程", is often mistaken to mean "an entirely opened course". In fact, OCW is not a curriculum: it provides sources and knowledge from the school, and self-study learners set their own study goals, deciding the learning methods and content by themselves. The OCW learning system is totally different from school or distance learning. In brief, OCW is not a package tour; it is more like backpacking: travelers can arrange their own time and schedule, using the information they have, in as detailed or as easy and comfortable a way as they like.

NCTU launched the NCTU OCW website, free of charge, in 2007. From 2007 until now (2012), NCTU has set up 127 courses (including 103 courses with video). All the sources are posted on the OCW official website (http://ocw.nctu.edu.tw). The courses follow the originally taught courses and provide the entire content, outline, goals and video for self-learning.

For learners, OCW reveals the content before a class starts, which helps in arranging courses. Students can watch different teachers' videos for cross-referencing, to clarify the concepts of a subject and understand the content of the courses. Students who have already taken a course, self-learners, and working people can refresh and re-experience the essentials of the courses. Teachers, in turn, can pay more attention to their academic and teaching work: because curriculum resources are shared on the internet, new teachers can take them as a reference when preparing to teach. OCW provides students and self-learners multiple channels of learning in pursuit of the goal of "Lifelong Learning".

A usability test is intended to determine the extent to which an interface facilitates a user's ability to complete routine tasks. Typically the test is conducted with a group of potential users either in a usability lab, remotely (using e-meeting software and a telephone connection), or on-site with portable equipment. Users are asked to
complete a series of routine tasks. Sessions are recorded and analyzed to identify potential areas for improvement to the web site.

This research conducted an onsite usability test using a live version of NCTU OCW on the test administrator's laptop. SCREEN2EXE software on the laptop captured the participants' comments and navigation choices, and a Sony DV camera recorded each whole testing session. The test administrator and participant were both present in the testing room. The sessions captured each participant's navigational choices, task completion rates, comments, overall satisfaction ratings, questions and feedback.

4.2 Executive Summary

This research conducted an onsite usability test during October 2012. The purpose of the test was to assess the usability of the web interface design, information flow, and information architecture.

Eight people participated in this usability test. Typically, a total of six to eight participants are involved in a usability test to ensure stable results. Each individual session lasted approximately one hour.

In general, all participants found the NCTU OCW website clear and straightforward; 62.5% (5 of the 8 participants) thought the web site was neither difficult nor easy to use, and 2 of the 8 participants (25%) thought the website was easy to use.

The test identified only a few minor problems, including:
• Lack of a video frame capture (thumbnail) for each video in the speech area
• Lack of links on the topic text of each course or lesson
• Lack of a hot recommendation page (it should be an independent page, not just an area at the top of the course page or one area of the front page)
• Lack of a categorization page under the discussion page; discard the category of "calculus, physics, and chemistry field" and instead set the category choices in the questionnaire for the user to choose from a dynamic dropdown list
• Lack of categorization on the speech page; too many speeches are listed on that page
• Confusion about the categorization of colleges (e.g., the Institute of Architecture is in the College of Social Science at NCTU, but in the College of Planning and Design at NCKU)
• Confusion about the search bar (e.g., what to type into it)
• Lack of discussion or feedback on the video pages
• Indirect placement of the submenu (e.g., the buttons for "curriculum front page", "curriculum videos", "curriculum outlines" and "curriculum schedule")

This document contains the participant feedback, satisfaction ratings, task completion rates, ease or difficulty of completion ratings, time on task, errors, and
recommendations for improvements. A copy of the scenarios and questionnaires is included in the Attachments section.

4.3 Methodology

4.3.1 Sessions

The test administrator contacted and recruited participants via NCKU staff (including assistants, students, and teachers). The test administrator sent e-mails informing them of the test logistics and requesting their availability and participation. Participants responded with an appropriate date and time.

Each individual session lasted approximately one hour. During the session, the test administrator explained the test session and asked the participant brief introductory questions (see Attachment A). Participants read the task scenarios and tried to find the information on the website.

After each task, the administrator asked the participant some questions in order to get the user's impressions of the task. Post-task scenario subjective measures included:
• How easy it was to find the information from the home page.
• Ability to keep track of their location in the website.
• Accuracy in predicting which section of the website contained the information.

After the last task was completed, the test administrator asked the participant the exit questions to get overall impressions of the website (see Attachment B). In addition, the test administrator asked the participants the following overall website questions:
• What the participant liked most.
• What the participant liked least.
• Recommendations for improvement.

Participants were also asked to rate the website overall using a 5-point Likert scale (Strongly Disagree to Strongly Agree) (see Attachment C) on eight subjective measures, including:
• Ease of use
• Frequency of use
• Difficulty of keeping track of location in the website
• Learnability - how easy it would be for most users to learn to use the website
• Information facilitation - how quickly the participant could find information
• Look & feel appeal - the homepage's content makes me want to explore the site further
• Site content - the site's content would keep me coming back
• Site organization
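For context (an illustrative sketch, not a tool from the study), responses on such a 5-point Likert scale are commonly summarized as a percent-agree figure, combining the Agree and Strongly Agree counts, which is how the overall ratings in this paper are reported. The function name and label strings here are our own.

```python
def percent_agree(counts):
    """Percent of respondents who chose Agree or Strongly Agree.

    `counts` maps the five scale labels ("strongly disagree", "disagree",
    "neutral", "agree", "strongly agree") to response counts; labels that
    received no responses may simply be omitted.
    """
    total = sum(counts.values())
    agree = counts.get("agree", 0) + counts.get("strongly agree", 0)
    return 100.0 * agree / total

# Example using the study's own counts for "I would like to use this
# system frequently" (8 participants).
ratings = {"disagree": 2, "neutral": 2, "agree": 3, "strongly agree": 1}
print(percent_agree(ratings))  # 50.0
```

With 8 participants, each response moves the figure in steps of 12.5 percentage points, which is why the ratings reported later fall on values such as 12.5%, 25%, 37.5% and 50%.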
4.3.2 Participants

The participants came from various fields, including a university student, graduate school students, a professor's assistant, a junior high school teacher and an unemployed male.

8 participants were scheduled over the testing, and all 8 completed the test. Of the 8 participants, 4 were male and 4 were female.

Profession                     Count
University Student             1
Graduate School Student        4
Assistant from Professor       1
Junior High School Teacher     1
Unemployed                     1

4.3.3 Evaluation Tasks/Scenarios

Test participants attempted completion of the following tasks:
• Find the Introduction to OCW
• Find the Important Message
• Find the speech "Crazy about the opera - Teaching you how to sing A Cappella"
• Find the course video of "Outlines of Architecture-97 - Green Architecture"
• Find the course outline content of "Outlines of Architecture-97 - Green Architecture"
• Find the Hot Recommendation course "Linear Algebra I - Department of Applied Mathematics"
• Find the course handout of "Linear Algebra I - Department of Applied Mathematics"
• Search for "統計學 Statistics 統計研究所 陳鄰安老師"
• Conduct a new "Chemistry" topic discussion
• Send the Feedback

4.4 Results

4.4.1 Task Completion Success Rate

All participants successfully completed Tasks 2, 3, 4, 6, 7, 8 and 10. Five of the eight (62.5%) completed Task 1 (find the Introduction to OCW). Most participants (87.5%) were able to complete Task 5 (find the course outline content of "Outlines of Architecture-97 - Green Architecture") and 75% were able to complete Task 9 (conduct a new "Chemistry" topic discussion). Three of the participants (37.5%) chose to attempt the additional task, and two of the three (66.6%) completed it.
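The arithmetic behind the completion rates and mean times in the tables that follow is straightforward; this sketch (illustrative only, with helper names of our own) reproduces two of the study's figures from the raw counts: Task 1's completion rate and Task 4's mean time on task.

```python
def completion_rate(successes, attempts):
    """Percentage of attempts that ended in task success."""
    return 100.0 * successes / attempts

def mean_time(seconds):
    """Average time on task, in seconds, over the participants who attempted it."""
    return sum(seconds) / len(seconds)

# Task 1: 5 of the 8 participants succeeded.
print(completion_rate(5, 8))  # 62.5

# Task 4 times on task (seconds), one value per participant,
# taken from the Time on Task table.
task4 = [60, 112, 90, 86, 307, 350, 65, 200]
print(mean_time(task4))  # 158.75
```

For tasks that not every participant attempted (such as the additional Task 11), the denominator is the number of attempts rather than the number of participants, which is how a 2-of-3 result yields 66.6%.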
Task Description

Task  Description
1     Find the Introduction to OCW
2     Find the Important Message
3     Find the speech "Crazy about the opera - Teaching you how to sing A Cappella"
4     Find the course video of "Outlines of Architecture-97 - Green Architecture"
5     Find the course outline content of "Outlines of Architecture-97 - Green Architecture"
6     Find the Hot Recommendation course "Linear Algebra I - Department of Applied Mathematics"
7     Find the course handout of "Linear Algebra I - Department of Applied Mathematics"
8     Search for "統計學 Statistics 統計研究所 陳鄰安老師"
9     Conduct a new "Chemistry" topic discussion
10    Send the Feedback

Task Completion Rates (Task 11 is the additional task; participants 1, 2, 3, 6 and 8 did not attempt it)

Participant   Tasks attempted   Tasks completed
1             10                10
2             10                10
3             10                10
4             11                10
5             11                8
6             10                9
7             11                10
8             10                9

Per task:

Task              1      2     3     4     5      6     7     8     9    10    11
Success           5      8     8     8     7      8     8     8     6    8     2
Completion Rate   62.5%  100%  100%  100%  87.5%  100%  100%  100%  75%  100%  66.6%

4.4.2 Time on Task

The testing software recorded the time on task for each participant. Some tasks were inherently more difficult to complete than others, which is reflected in the average time on task.

Task 4 required participants to find the Green Architecture lesson video in the
course "Outlines of Architecture-97" and took the longest time to complete (mean = 158.75 seconds). Completion times for this task ranged from 60 seconds (1 minute) to 350 seconds (nearly 6 minutes), with most times under 120 seconds (2 minutes).

Time on Task (seconds; na = not attempted)

Participant   T1    T2    T3     T4      T5   T6    T7    T8     T9    T10    T11
1             62    18    50     60      60   72    80    165    40    10     na
2             111   15    40     112     28   176   148   33     53    1      na
3             18    3     35     90      19   13    50    20     39    15     na
4             58    19    12     86      16   101   94    16     13    1      50
5             43    11    39     307     77   88    225   36     21    3      38
6             38    15    22     350     50   93    68    42     14    26     na
7             33    5     55     65      38   50    30    26     13    32     10
8             39    14    44     200     32   49    85    44     43    41     na
Avg.          50.25 12.5  37.125 158.75  40   80.25 97.5  47.75  29.5  16.125 32.67

4.4.3 Summary of Data

The table below displays a summary of the test data. Low completion rates, low satisfaction ratings, high error counts and long times on task stand out in the data.

Summary of Completion, Errors, Time on Task

Task   Completion   Errors   Time on Task (s)
1      5            2        50
2      8            0        13
3      8            3        37
4      8            8        159
5      7            3        40
6      8            6        80
7      8            6        98
8      8            2        48
9      6            5        30
10     8            2        16

4.5 Overall Metrics

4.5.1 Overall Ratings

After completing the task sessions, participants rated the site on eight overall measures (see Attachment C). These measures include:
• Ease of use
• Frequency of use
• Difficulty of keeping track of where they were in the site
• How quickly most people would learn to use the site
• Getting information quickly
• Homepage's content facilitates exploration
• Relevancy of site content
• Site organization

Half of the participants (50%) agreed (i.e., agree or strongly agree) that they would use this system frequently and that the site's content would keep them coming back. However, half of the participants (50%) also agreed that the system is unnecessarily complex, and found the system very cumbersome to use. Only 12.5% agreed that they felt confident while using the system.

SUS Likert Scale (see Attachment C)

1. I would like to use this system frequently: Disagree 2, Neutral 2, Agree 3, Strongly Agree 1 (50% agree)
2. I found the system unnecessarily complex: Disagree 1, Neutral 3, Agree 4 (50% agree)
3. I thought the system was easy to use: Disagree 1, Neutral 5, Agree 2 (25% agree)
4. I would need the support of a technical person to be able to use this system: Strongly Disagree 4, Disagree 4 (0% agree)
5. I found the various functions in this system were well integrated: Disagree 1, Neutral 5, Agree 2 (25% agree)
6. I thought there was too much inconsistency in this system: Disagree 3, Neutral 2, Agree 3 (37.5% agree)
7. I imagine most people would learn to use this system very quickly: Disagree 2, Neutral 3, Agree 1, Strongly Agree 2 (37.5% agree)
8. I found the system very cumbersome to use: Strongly Disagree 1, Disagree 1, Neutral 2, Agree 4 (50% agree)
9. I felt confident using the system: Strongly Disagree 1, Disagree 2, Neutral 4, Agree 1 (12.5% agree)
10. I needed to learn a lot of things before I could get going with this system: Disagree 3, Neutral 3, Agree 2 (25% agree)

*Percent agree = Agree and Strongly Agree responses combined

4.5.2 Likes, Dislikes, Participant Recommendations

Upon completion of the tasks, participants provided feedback on what they liked most and least about the website, and recommendations for improving it.

Liked Most

The following comments capture what the participants liked most:
• The latest news
• The homepage video timeline
• Speech content
• Course schedule and video (chapter list)
• Advice feedback
• Discussion page
• Homepage
• Platform design (clear, beautiful, no ads, easy to understand)

Liked Least

The following comments capture what the participants liked least:
• Search function
• Category of course and speech (should be listed in detail)
• The category of the school college
• Hot recommendation
• The submenu is placed in an indirect, non-obvious area
• The feedback area is cheerless
• Font size too small
• Too many videos on the speech page

Recommendations for Improvement
• Links should be set on topics (e.g., hot recommendation, course and speech topics)
• The homepage's gradient color could be more natural
• The GUI could be much younger (it is too official)
• The submenu is placed in an indirect, non-obvious area; it could be put just under the video or just under the title of the course or speech topic
• The department category of the college is not clear (because different schools use different categories)
• More interactive dynamic dropdown lists
• The search function should be more obvious and more functionally consistent

CHAPTER 5 Conclusion

Most of the participants found the NCTU OCW platform well designed, clean and uncluttered, but most also thought it is not so easy to use for a first-time user: it takes time to become familiar with. In brief, user-centered design should play a larger part in this website's design. Implementing the recommendations and continuing to work with users will ensure a continued user-centered website.
CHAPTER 6 References

Albion, P. (1999). Heuristic evaluation of educational multimedia: From theory to practice. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE), Brisbane, Australia.

Cheng, X., Dale, C., & Liu, J. (2007). Understanding the characteristics of internet short video sharing: YouTube as a case study. CoRR.

Clark, R. C., & Mayer, R. E. (2008). e-Learning and the science of instruction (pp. 10-11).

Govindasamy, T. (2001). Successful implementation of e-learning: Pedagogical considerations.

Nielsen, J., & Mack, R. L. (Eds.). (1994). Usability inspection methods. Wiley.

Quinn, C. N. (1996). Pragmatic evaluation: Lessons from usability. Paper presented at Making New Connections: ASCILITE 1996.

Reeves, T. C., Benson, L., Elliott, D., Grant, M., Holschuh, D., Kim, B., Kim, H., Lauber, E., & Loh, S. (2002). Usability and instructional design heuristics for e-learning evaluation.

Zhang, D., Zhao, J. L., Zhou, L., & Nunamaker, J. F. (2004). Can e-learning replace classroom learning? Communications of the ACM, 47(5), 75-79. doi:10.1145/986213.986216
CHAPTER 7 Appendix

Attachment A - Brief Introductory Questions

Attachment B - Exit Questions

Attachment C - 5-point Likert Scale
