Testing an assessment of problem-solving in introductory chemical process design courses (WIP)

ISSN: 2153-5965
Citations: 2
Readers: 7 (Mendeley users who have this article in their library)

Abstract

Problem-solving is consistently cited as one of the most important outcomes of an undergraduate education in engineering [1-3]. While it is generally held that scientists and engineers are trained to be good problem-solvers, there is very little research confirming this belief. Indeed, some work suggests that engineering graduates are ill-prepared to solve the complex problems they encounter in the workplace [4]. Substantial work has been devoted to characterizing student and expert problem-solving in physics [5-11] and engineering [12-14], but there are almost no agreed-upon measures of problem-solving [8]. If we are to teach undergraduate students to solve complex, real-world problems, we must be able to measure how well they are learning the necessary skills. In this work, we describe the testing of a new assessment to measure dimensions of problem-solving in undergraduate chemical engineering courses.

Much of the empirical work on problem-solving has focused on differences between experts and novices as they solve well-structured problems [e.g. 6]. While this work has provided valuable insights, such as the finding that novices focus on the surface features of problems while experts focus on the concepts underpinning them, it provides a limited picture of problem-solving because the problems that scientists and engineers encounter in the workplace are not well-structured. These ill-structured problems may have conflicting goals, multiple solution methods, multiple forms of representation, and non-engineering success standards [12]. Indeed, Hong, Jonassen, and McGee (2003) found that solving ill-structured problems involves higher-order metacognitive skills beyond those required for well-structured problems [15].

Price et al. conducted an empirical study of expert problem-solving that frames the process of solving an ill-structured (authentic) problem in terms of the decisions that experts make [16]. They found a remarkably consistent set of approximately 30 decisions that experts make as they solve problems, such as deciding how to decompose the problem into smaller pieces, deciding on an appropriate abstract representation of the problem (e.g., diagrams or equations), and deciding on the failure modes of a potential solution. These empirical findings are in line with theory suggesting that decision-making is the core process in solving a variety of complex problems, such as design problems [17, 18].

Central to Price et al.'s empirical model of problem-solving is the expert's predictive framework, similar to a mental model or schema in other problem-solving literature. The predictive framework is a mental representation of the problem's key features and the relationships between them, which allows the expert to make predictions and explain observations. A predictive framework has three key features: (1) it allows the expert to identify important problem elements and eliminate unimportant ones; (2) it allows the expert to explain relationships between these elements, which involves some degree of mechanistic reasoning; and (3) it is detailed enough that the expert can conduct thought experiments by manipulating important variables.

In light of the work of Price et al., we sought to develop an assessment of engineering problem-solving by posing a problem that requires the solver to make some of the same decisions that an expert problem-solver makes. Textbook problems are not suited to this task because the expert decisions are often made for the solver; for example, assumptions are almost always given to the solver in physics problems rather than left for the solver to identify. We found that a troubleshooting task, such as critiquing a flawed product design schematic, was well-suited to this assessment, as it requires the solver to make many of the expert decisions [19]. The general structure of the assessment, which may be applied in any science or engineering discipline, is depicted in Fig. 1. Here we create and test a chemical engineering problem-solving assessment based on this design.



CITATION STYLE

APA

Burkholder, E., & Wieman, C. (2020). Testing an assessment of problem-solving in introductory chemical process design courses (WIP). In ASEE Annual Conference and Exposition, Conference Proceedings (Vol. 2020-June). American Society for Engineering Education.


Readers' Seniority

PhD / Post grad / Masters / Doc: 2 (67%)
Lecturer / Post doc: 1 (33%)

Readers' Discipline

Chemical Engineering: 1 (33%)
Social Sciences: 1 (33%)
Sports and Recreations: 1 (33%)
