Search results - Multimedia Systems o Web Programming HTML*


    Source: CISTI (Iberian Conference on Information Systems & Technologies / Conferência Ibérica de Sistemas e Tecnologias de Informação) Proceedings; 2013, Vol. 1, p926-931, 6p


    File description: 175 pages; application/pdf

    https://hdl.handle.net/10614/13320; Universidad Autónoma de Occidente; Repositorio Educativo Digital; https://red.uao.edu.co/