High-dimensional vector quantization plays an important role in K-nearest-neighbor (KNN) search on large datasets. In recent years, a large literature on vector quantization has emerged, including product quantization (PQ), optimized product quantization (OPQ), additive quantization (AQ), and stacked quantization (SQ). However, these methods suffer either from large quantization error or from inefficient codebook learning and encoding. In this paper, we propose a new vector quantization method, called SPQ, which combines the strengths of PQ and SQ. On one hand, compared with PQ, SPQ learns a more precise subcodebook in each subspace. On the other hand, it generates the codebook with less time and memory than SQ. Extensive experiments on benchmark datasets demonstrate that SPQ generates codebooks and encodes vectors faster than SQ while maintaining the same quantization error. Furthermore, we show that SPQ has good scalability and compares favorably with the state-of-the-art.
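To make the two ingredients concrete, the following is a minimal, illustrative sketch of product quantization with one stacked (residual) layer, in the spirit of PQ and SQ as described above. It is not the paper's SPQ algorithm; the helper `kmeans`, the subspace count `m`, and the subcodebook size `k` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 1000 vectors of dimension 8 (stand-in for a large dataset).
X = rng.normal(size=(1000, 8))

def kmeans(data, k, iters=10):
    # Minimal Lloyd's k-means for illustration only.
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        d = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(1)
        for j in range(k):
            pts = data[assign == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers

def pq_encode(X, m=2, k=16):
    # PQ: split each vector into m subvectors and quantize each
    # subspace independently with its own subcodebook.
    d = X.shape[1] // m
    codes, books = [], []
    for i in range(m):
        sub = X[:, i * d:(i + 1) * d]
        cb = kmeans(sub, k)
        dist = ((sub[:, None, :] - cb[None, :, :]) ** 2).sum(-1)
        codes.append(dist.argmin(1))
        books.append(cb)
    return np.stack(codes, 1), books

def pq_decode(codes, books):
    # Reconstruct each vector by concatenating its chosen subcodewords.
    return np.concatenate([books[i][codes[:, i]] for i in range(len(books))], axis=1)

# First layer: plain PQ encoding.
codes1, books1 = pq_encode(X)
residual = X - pq_decode(codes1, books1)

# Stacked (SQ-style) second layer: quantize the residual of the first
# layer, which reduces quantization error at extra encoding cost.
codes2, books2 = pq_encode(residual)
approx = pq_decode(codes1, books1) + pq_decode(codes2, books2)

err1 = ((X - pq_decode(codes1, books1)) ** 2).sum()
err2 = ((X - approx) ** 2).sum()
print(codes1.shape)  # one code per subspace per vector
```

The residual layer illustrates the trade-off the abstract refers to: stacking layers lowers quantization error but multiplies codebook-learning and encoding cost, which motivates combining the two schemes.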