The JetBrains documentation on running SQL scripts says little about processing large insert statements. There is a discussion in the DataGrip community forum, and apparently features are coming that will make working with large scripts easier.
A quote from that thread:
Huge SQL files can be executed from Files view (use a context menu action).
I assume you are trying to import a database export, i.e. a series of SQL statements saved to a file. Running a very large script entirely in memory can cause memory problems. Try the following.
Insert commit statements into your SQL file with a text editor (this can even be done from within DataGrip). Every few hundred statements, add the line
commit;
which should purge the previous statements from memory. I strongly recommend saving the edited file separately from the original export script. This method is not suitable if you need an all-or-nothing import, meaning that if even one statement or block fails, you want every statement rolled back.
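To make this concrete, here is a sketch of what the edited export file might look like. The table and column names are placeholders, not taken from your actual export:

    insert into customers (id, name) values (1, 'Alice');
    insert into customers (id, name) values (2, 'Bob');
    -- ... a few hundred more insert statements ...
    commit;  -- flushes this batch, so it no longer accumulates in memory

    insert into customers (id, name) values (201, 'Carol');
    -- ... the next batch ...
    commit;

If you do need all-or-nothing behavior, the approach is the opposite: leave the periodic commits out and run the whole script as a single transaction, accepting the higher memory use. Depending on your database, that looks roughly like:

    begin;
    -- all insert statements, with no intermediate commits
    commit;   -- or rollback; if anything failed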