The database contains years of data from all kinds of input sources, probably including copy/paste from MS Word, and some of those characters can't be encoded as JSON if taken "as is", without first identifying their character set.
I discovered this when a new endpoint was crashing with a 500 server error. After xdebugging the problem right down into Laravel's core, I found that the Response object, while building its output, was NULLing my data when trying to JSONify it in morphToJson(). So when that processed content gets passed to setContent(), it fails the is_callable(array($content, '__toString')) test and throws an exception.
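A minimal sketch of what's going wrong (the byte values here are hypothetical Windows-1252 "smart quotes", not my actual data): when a string isn't valid UTF-8, json_encode() refuses to encode it, and that failure is what empties the response body.

```php
<?php
// Hypothetical example: 0x93 / 0x94 are Windows-1252 smart quotes,
// exactly the kind of bytes a copy/paste from MS Word leaves behind.
// They are not valid UTF-8, so json_encode() gives up.
$row = ['note' => "\x93pasted from Word\x94"];

$json = json_encode($row);

var_dump($json);                // bool(false)
echo json_last_error_msg();     // Malformed UTF-8 characters, possibly incorrectly encoded
```

That false/empty result is what ends up flowing through setContent() and blowing up the response.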
I had already suspected this was an encoding problem, so I’d found the solution buried, and apparently ignored, here: https://laracasts.com/discuss/channels/general-discussion/laravel-5-problem-with-utf-8-in-a-json-response.
Quite simply, in your freetds.conf, in the [global] settings area, set client charset = UTF-8.
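For reference, the relevant fragment looks like this (the file's location varies by distribution, e.g. /etc/freetds/freetds.conf on Debian/Ubuntu; check your install):

```ini
[global]
    ; Have FreeTDS transcode the server's charset to UTF-8
    ; before handing strings to the client (PHP, in this case)
    client charset = UTF-8
```

You may need to restart PHP-FPM (or whatever process holds the database connection) for the new setting to be picked up.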
And done. The funky characters are correctly interpreted, don’t NULL the JSON, and don’t crash the API.