The video reviews and compares six AI tools for generating academic literature reviews: SciSpace, Thesis AI, AnswerThis, ChatGPT, Gemini, and Manus AI. The creator uses the same detailed prompt for each tool, requesting a graduate-level literature review on self-healing nanocomposite transparent electrodes with an emphasis on synthesis, critical evaluation, and accurate referencing. The evaluation criteria are the number of references, length and depth of content, readability, exportability of the output, and how easily the text is flagged by AI-detection tools.
In terms of references, Gemini and SciSpace stand out by providing the highest number of relevant citations (36 and 28, respectively), making them strong choices for users who prioritize comprehensive sourcing. Thesis AI and ChatGPT also perform reasonably well, while AnswerThis lags behind with only six references, failing to meet the prompt’s minimum requirement. The reviewer notes that Gemini’s output is particularly impressive for its inclusion of tables and synthesized results, offering a useful snapshot of the field.
When it comes to length, Thesis AI produces the most extensive output, generating content suitable for a thesis-level literature review, while SciSpace and Gemini also deliver substantial and detailed reviews. AnswerThis, on the other hand, produces the shortest and least detailed output, making it less suitable for in-depth academic work. The reviewer emphasizes that while length alone isn’t everything, a literature review should be detailed enough to provide meaningful insights and thematic organization.
Readability is another key factor, with Thesis AI receiving the highest marks for academic tone and clarity, despite being somewhat wordy. The reviewer criticizes some tools for using unnecessarily complex or obscure vocabulary, which can detract from readability and make the output feel artificially inflated. AnswerThis, in particular, is noted for awkward phrasing and less natural academic language, while the other tools generally fall in the middle range for readability.
Exportability is crucial for academic workflows, and Thesis AI excels by offering multiple export formats, including PDF, Word, and Overleaf (LaTeX), making it highly adaptable for further editing. AnswerThis also provides good export options, while SciSpace requires payment for certain formats and ChatGPT’s export process is cumbersome, often requiring manual copying. All tools, however, are easily detected as AI-generated by originality-checking software, so users should treat the outputs as starting points rather than final submissions.

Overall, SciSpace and Thesis AI emerge as the top choices, with Thesis AI being the reviewer’s preferred tool for its balance of references, readability, and exportability.
