Lines matching refs: size
83 * call before __tlb_remove_page*() to set the current page-size; implies a possible tlb_flush_mmu() call.
127 * returns the smallest TLB entry size unmapped in this range.
138 * changes the size and provides mmu_gather::page_size to tlb_flush().
140 * This might be useful if your architecture has size-specific TLB invalidation instructions.
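
The comment fragments above (source lines 83, 127, 138 and 140) describe the mmu_gather page-size machinery: the current page size is recorded before __tlb_remove_page*() queues pages, the smallest unmapped TLB entry size can be queried for the gathered range, and the recorded size is handed to the architecture's tlb_flush(). The sketch below is not taken from the matched file; it only illustrates, assuming the kernel's struct mmu_gather fields (start, end, fullmm, need_flush_all, page_size under CONFIG_MMU_GATHER_PAGE_SIZE) and a hypothetical local_invalidate_page() helper, how an architecture with size-specific TLB invalidation instructions might consume mmu_gather::page_size.

/*
 * Illustrative only; not part of the matched file.  Assumes the kernel's
 * struct mmu_gather (with CONFIG_MMU_GATHER_PAGE_SIZE) and a hypothetical
 * local_invalidate_page() that issues a size-specific invalidation.
 */
static inline void tlb_flush(struct mmu_gather *tlb)
{
	unsigned long addr;

	if (tlb->fullmm || tlb->need_flush_all) {
		flush_tlb_all();			/* whole-TLB fallback */
		return;
	}

	/*
	 * mmu_gather::page_size was recorded by the core code before the
	 * pages were queued, so the loop can step in the right granule.
	 */
	for (addr = tlb->start; addr < tlb->end; addr += tlb->page_size)
		local_invalidate_page(addr, tlb->page_size);	/* hypothetical */
}
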
567 unsigned long address, unsigned long size)
569 __tlb_adjust_range(tlb, address, size);
574 unsigned long address, unsigned long size)
576 __tlb_adjust_range(tlb, address, size);
581 unsigned long address, unsigned long size)
583 __tlb_adjust_range(tlb, address, size);
588 unsigned long address, unsigned long size)
590 __tlb_adjust_range(tlb, address, size);
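
The matched code lines at 567-590 show only the "address, size" parameter of four inline helpers and their common __tlb_adjust_range(tlb, address, size) call; the function names and the rest of each body are elided by the matching. As a sketch of the surrounding context, assuming these helpers follow the tlb_flush_pte_range()/tlb_flush_pmd_range() pattern from the kernel's include/asm-generic/tlb.h (the names and the cleared_* fields do not appear in the listing and are an assumption here):

/* Sketch, assuming struct mmu_gather from include/asm-generic/tlb.h. */
static inline void __tlb_adjust_range(struct mmu_gather *tlb,
				      unsigned long address,
				      unsigned long range_size)
{
	/* Widen the pending flush range to cover [address, address + range_size). */
	tlb->start = min(tlb->start, address);
	tlb->end   = max(tlb->end, address + range_size);
}

static inline void tlb_flush_pte_range(struct mmu_gather *tlb,
				       unsigned long address, unsigned long size)
{
	__tlb_adjust_range(tlb, address, size);
	tlb->cleared_ptes = 1;	/* note that PTE-level mappings were torn down */
}

The three sibling helpers at 574, 581 and 588 would differ only in which cleared_* bit they set (pmds, puds, p4ds); those bits are what later allows the "smallest TLB entry size unmapped in this range" mentioned in the line-127 comment to be computed.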