By Peter Murphy
Published November 1, 2024
Computational notebooks have grown increasingly common in engineering curricula. A University at Buffalo-led study will analyze how these tools help deliver student learning outcomes and develop a rubric to evaluate, and ultimately enhance, computational notebooks in engineering courses.
These digital tools let users run code, see the results and take notes in a single document. For engineering students, that means viewing instructional notes and diagrams, solving problems and analyzing data in real time, all in one interactive file.
The $200,000 project, funded by the National Science Foundation, is led by Viviana Monje, assistant professor in the Department of Chemical and Biological Engineering. Ashlee Ford Versypt, associate professor of chemical and biological engineering, and Matilde Sánchez-Peña, assistant professor in the Department of Engineering Education, will serve as co-principal investigators and mentors.
“On one end, they help students learn a programming language, which, in itself, is beneficial in the current job market,” Monje says. “On the other hand, they have the potential to impact their learning experience, which is the focus of our study.”
Computational notebooks, such as Jupyter, Google Colab and MATLAB live scripts, are often used in undergraduate engineering courses. Monje, Ford Versypt and Sánchez-Peña will implement their notebooks in an introductory chemical engineering course on statistics and data analysis.
Despite the ubiquity of notebooks and other computational tools in engineering courses, there is little data on how notebook content is selected and organized, or on how that content affects student learning.
“There is limited literature on this topic in the context of general engineering education. There are limited guidelines on how to design these notebooks to quantitatively analyze whether they facilitate student learning,” Monje says. “Additionally, there are no evidence-based protocols to select key concepts in engineering courses, other than based on instructor experience over multiple terms teaching a given subject.”
These tools are usually built by the instructor on a trial-and-error basis, and no rubric or other system currently exists to objectively evaluate their effectiveness and learning outcomes. The researchers will use the grant to help engineering instructors design notebooks more intentionally, delivering key concepts to students based on research evidence.
Monje will build and implement the notebooks in her course “CE 305: Probability, Statistics and Data Analysis.” While the specific deliverables in this project are associated with undergraduate education in chemical engineering, the researchers believe their findings can extend to other engineering disciplines.
“Nearly all engineering programs have a computational/numerical methods course in their undergraduate curricula,” says Monje. “Most undergraduates in our classrooms use digital devices to take notes. The digital era and use of AI in the classroom are not going away any time soon. It is certainly a good investment to be intentional in the design of computational notebooks to enhance the learning experience of students and teach them how to use technology responsibly and ethically.”
Each researcher has experience using such notebooks in different settings. Ford Versypt uses notebooks to train new undergraduate researchers and in chemical engineering courses and interdisciplinary electives. Monje uses MATLAB to give students in her statistics class examples of numerical simulation, graphical representations of data and hypothesis testing. Sánchez-Peña has evaluated notebooks' effectiveness for student learning and has infused the cognitive apprenticeship model, the same model that will be employed in this project, into notebooks she has designed for data science education.
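To give a concrete picture of the kind of cell such a notebook might contain, here is a minimal sketch, written in Python for illustration (the project itself uses MATLAB live scripts); the simulated "yield" data, sample sizes and variable names are hypothetical, not taken from the course:

    # Illustrative notebook cell: simulate two sets of measurements and
    # test whether their means differ, with the result displayed inline.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    batch_a = rng.normal(loc=5.0, scale=0.4, size=30)  # simulated yield, process A
    batch_b = rng.normal(loc=5.3, scale=0.4, size=30)  # simulated yield, process B

    t_stat, p_value = stats.ttest_ind(batch_a, batch_b)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")      # output appears directly below the cell

Running the cell displays the code and its output together, so a student can read the surrounding explanation, change a parameter and immediately see how the result changes.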
The cognitive apprenticeship model helps students build the cognitive skills needed for real-world problem solving. Students start out reproducing the steps of an expert and gradually become more independent. Traditional apprenticeship focuses on physical tasks; the cognitive apprenticeship model focuses on thinking processes and problem-solving skills. Monje, Ford Versypt and Sánchez-Peña will not only implement this learning model in their computational notebooks but also develop guidelines to evaluate it and iterate the notebook design to enhance student learning.
“We envision a set of guidelines to systematically identify key concepts in a subject based on student performance from previous semesters; select the content of the computational notebook to enable qualitative and quantitative analysis of student learning gains at the end of the semester to guide the updates for future iterations of the notebook; and objectively evaluate student interaction and performance using the notebook,” Monje says.
The first version of the notebooks will be implemented in fall 2025. The researchers will base their contents on previous student performance and engagement with course materials.