Channel: Very large matrices using Python and NumPy - Stack Overflow
Browsing all 12 articles

Answer by Mazdak for Very large matrices using Python and NumPy

Sometimes a simple solution is to use a custom type for your matrix items. Depending on the range of numbers you need, you can specify a dtype manually and pick an especially small one for your items. Because Numpy...
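A minimal sketch of the dtype trick the answer describes: if your values fit in a small integer type, declaring the dtype explicitly cuts memory per element (the shapes here are illustrative).

```python
import numpy as np

# 10000 x 10000 at the default float64 would take ~800 MB;
# the same shape at int8 (values -128..127) takes ~100 MB.
big = np.zeros((10000, 10000), dtype=np.int8)
print(big.nbytes)  # 100000000 bytes, i.e. ~95 MiB

# Per-element saving compared with the default float64:
per_element_savings = np.dtype(np.float64).itemsize // np.dtype(np.int8).itemsize
print(per_element_savings)  # 8
```

The saving is purely per element, so it helps by a constant factor; for matrices that still don't fit in RAM, the disk-backed approaches in the other answers are needed.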




Answer by wisty for Very large matrices using Python and NumPy

It's a bit alpha, but http://blaze.pydata.org/ seems to be working on solving this.


Answer by dwf for Very large matrices using Python and NumPy

Make sure you're using a 64-bit operating system and a 64-bit version of Python/NumPy. Note that on 32-bit architectures you can typically address about 3 GB of memory (with about 1 GB lost to memory mapped...
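A quick way to verify the point above from within Python itself (both checks below use only the standard library):

```python
import struct
import sys

# On a 64-bit Python build, pointers are 8 bytes and sys.maxsize is 2**63 - 1;
# a 32-bit build caps the address space no matter how much RAM is installed.
is_64bit = sys.maxsize > 2**32
print(struct.calcsize("P") * 8, "-bit Python;",
      "large address space available" if is_64bit else "32-bit: ~3 GB ceiling")
```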


Answer by Stephen Simmons for Very large matrices using Python and NumPy

PyTables and NumPy are the way to go. PyTables will store the data on disk in HDF format, with optional compression. My datasets often get 10x compression, which is handy when dealing with tens or...
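A sketch of the PyTables approach, assuming the `tables` package is installed (the file name, shape, and compression level are illustrative): the matrix lives on disk in compressed HDF5 chunks, and only the slices you ask for are read into memory.

```python
import numpy as np
import tables  # PyTables: pip install tables

# Write a disk-backed, compressed array in row blocks, never holding
# the whole matrix in RAM at once.
filters = tables.Filters(complevel=5, complib="zlib")
with tables.open_file("big_matrix.h5", mode="w") as f:
    carray = f.create_carray(f.root, "data", tables.Float64Atom(),
                             shape=(2000, 2000), filters=filters)
    for start in range(0, 2000, 500):
        carray[start:start + 500, :] = np.random.rand(500, 2000)

# Reopen and read back only a small slice; just those chunks hit memory.
with tables.open_file("big_matrix.h5", mode="r") as f:
    block = f.root.data[0:100, 0:100]
    print(block.shape)  # (100, 100)
```

Random data is incompressible, so the 10x figure in the answer applies to real-world datasets with redundancy, not to this toy example.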


Answer by SingleNegationElimination for Very large matrices using Python and NumPy

Stefano Borini's post got me to look into how far along this sort of thing already is. This is it. It appears to do basically what you want. HDF5 will let you store very large datasets, and then access...



Answer by Roberto Bonvallet for Very large matrices using Python and NumPy

numpy.arrays are meant to live in memory. If you want to work with matrices larger than your RAM, you have to work around that. There are at least two approaches you can follow: Try a more efficient...


Answer by S.Lott for Very large matrices using Python and NumPy

Are you asking how to handle a 2,500,000,000-element matrix without terabytes of RAM? The way to handle 2 billion items without 8 billion bytes of RAM is by not keeping the matrix in memory. That means...


Answer by Alex Martelli for Very large matrices using Python and NumPy

To handle sparse matrices, you need the scipy package that sits on top of numpy -- see here for more details about the sparse-matrix options that scipy gives you.
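A small sketch of the scipy.sparse option (shapes and values here are illustrative): a matrix far too large to hold densely is cheap when only a handful of entries are nonzero, since only those entries are stored.

```python
from scipy import sparse

# A dense 1,000,000 x 1,000,000 float64 matrix would need ~8 TB of RAM;
# a sparse format stores only the nonzero entries.
m = sparse.lil_matrix((10**6, 10**6))  # LIL is efficient for incremental filling
m[0, 0] = 1.0
m[500_000, 123] = 3.5

csr = m.tocsr()  # CSR is efficient for arithmetic and row slicing
print(csr.nnz)     # 2 stored entries
print(csr[0, 0])   # 1.0
```

Converting between formats matters: build in LIL (or COO), then switch to CSR/CSC before doing heavy arithmetic.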



Answer by DopplerShift for Very large matrices using Python and NumPy

You should be able to use numpy.memmap to memory map a file on disk. With a newer Python and a 64-bit machine, you should have the necessary address space, without loading everything into memory. The OS...
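A minimal sketch of the numpy.memmap approach (file name and shape are illustrative): the array is backed by a file, and the OS pages data in and out on demand, so the full matrix never has to fit in RAM.

```python
import numpy as np

# Create a disk-backed array; mode="w+" creates the file zero-filled.
mm = np.memmap("matrix.dat", dtype=np.float64, mode="w+", shape=(2000, 2000))
mm[:100, :100] = 1.0  # only the pages you actually touch are written
mm.flush()

# Reopen read-only; slicing reads just the needed bytes from disk.
ro = np.memmap("matrix.dat", dtype=np.float64, mode="r", shape=(2000, 2000))
print(ro[50, 50], ro[1999, 1999])  # 1.0 0.0
```

One caveat: a memmap works like a normal ndarray, but whole-array operations (e.g. `ro.sum()`) will still stream every byte through memory, so keep your access patterns chunked.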



Answer by Stefano Borini for Very large matrices using Python and NumPy

As far as I know about numpy, no, but I could be wrong. I can propose this alternative solution: write the matrix to disk and access it in chunks. I suggest the HDF5 file format. If you...


Answer by Nick Dandoulakis for Very large matrices using Python and NumPy

Usually when we deal with large matrices we implement them as sparse matrices. I don't know if numpy supports sparse matrices, but I found this instead.


Very large matrices using Python and NumPy

NumPy is an extremely useful library, and from using it I've found that it's capable of handling matrices which are quite large (10000 x 10000) easily, but begins to struggle with anything much larger...
