I have a 0.3 GB CSV file. Every time I try to load the data in a Jupyter notebook, my system freezes. I have 4 GB of RAM.
What are the possible solutions?
A 0.3 GB CSV can easily exhaust 4 GB of RAM, because pandas may need several times the on-disk file size in memory once the data is parsed into a DataFrame. A few options:

- Use the `chunksize` parameter of `pandas.read_csv` to process the file in smaller pieces instead of loading it all at once.
- Reduce the memory footprint by passing `usecols` to load only the columns you need, and `dtype` to use smaller types (e.g. `float32` instead of `float64`).
- If local memory is still a bottleneck, work on the dataset in Google Colab, which gives you more RAM for free.