Algorithms have entered the courts, e.g. via scores assessing the risk of recidivism. At first sight, recent applications appear to be clear cases of solutionism, i.e. attempts at fixing social problems with technological solutions. A thematic analysis of assessments of two of the most prominent and widespread recidivism scores, COMPAS and the PSA, casts doubt on this notion. Crucial problems, as different as "fairness" (COMPAS) and "proper application" (PSA), are not tackled in a technological manner but rather by installing conversations. This shows that even technorationalists never see the technological solution in isolation but actively search for flanking social methods, thereby accounting for problems that cannot be eased technologically. Furthermore, we witness social scientists being called upon as active participants in such engineering.