No.13977174
Hello, /sci/, programmer here looking for help deriving a formula so I can implement it in one of my apps.

The problem: you have an image of width W and height H (WxH) and you'd like to take as many maximum-size square crops of it as possible. Since images on a computer can only be rectangles (at least, to my knowledge), the largest square that fits has a side length equal to the image's shorter side.

So, if you have a 512x256 rectangle, you will create a number of 256x256 squares whose origins are separated by some stride of Z pixels. The end user can input the stride - a smaller Z produces more crops and a higher-fidelity end image, but is obviously more resource intensive since more images are created and used.

So, for Z = 256, you create a 256x256 crop at x = 0 and another at x = 256, for a total of 2 images. N = 2

Z = 128, you create crops at x = 0, x = 128, and x = 256. N = 3

Z = 64, you create crops at x = 0, x = 64, and so on up to x = 256. N = 5
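For reference, here's a minimal Python sketch of the scheme as I'm describing it. The function name square_crop_offsets is just a placeholder, and the final edge-aligned crop is my assumption about what should happen when Z doesn't divide the slack evenly:

```python
def square_crop_offsets(w, h, z):
    """Return the top-left offsets (along the longer axis) of square
    crops of side min(w, h), spaced z pixels apart.
    Placeholder name; the trailing edge-aligned crop is an assumption."""
    side = min(w, h)
    span = max(w, h) - side                     # how far the window can slide
    offsets = list(range(0, span, z)) + [span]  # always include the far edge
    return offsets

# reproduces the examples above:
print(len(square_crop_offsets(512, 256, 256)))  # 2
print(len(square_crop_offsets(512, 256, 128)))  # 3
print(len(square_crop_offsets(512, 256, 64)))   # 5
```

Each offset would then be fed to whatever crop call the imaging library provides, e.g. a box of (x, 0, x + side, side) for a landscape image.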

So as Z is halved (in this particular instance, with a 2:1 aspect ratio image), N follows the pattern 2^k + 1, where k counts the halvings: Z = 256 gives 2^0 + 1 = 2, Z = 128 gives 2^1 + 1 = 3, Z = 64 gives 2^2 + 1 = 5. I can understand that well enough.

What I'm having difficulty with is generalizing this into something I can plug arbitrary W, H, and Z into, so it's parameterizable and thus user configurable. Any assistance would be appreciated.

MS paint example for you if it helps you visualize it.