Maximum sum 2D:
Problem: Given a 2D array containing both positive and negative integers, compute the maximum sum over all submatrices (sub-rectangles, which may be squares).
IDEA:
1.) Fix a left column lft.
2.) Fix a right column rgt, sweeping from lft to the rightmost column.
3.) For every fixed pair (lft, rgt), traverse the rows and keep a cumulative sum, adding each row's total between those two columns.
Note: precompute the row-wise prefix sums first, so the sum of any row segment [lft, rgt] is available in O(1).
4.) Update the maximum sum on every step; if the cumulative sum becomes negative, reset it to 0, since a negative prefix can never improve a later submatrix. (This is 1D Kadane; a minimal sketch follows the list.)
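Steps 3 and 4 are exactly 1D Kadane run on the per-row strip sums. A minimal self-contained sketch, where the function name and the hard-coded strip sums are illustrative, not part of the original solution:

#include <bits/stdc++.h>
using namespace std;

// 1D Kadane with reset: given the sum of each row restricted to a
// fixed column pair (lft, rgt), find the best contiguous run of rows.
int kadane1D(const vector<int>& stripSums)
{
    int best = INT_MIN, run = 0;
    for (int s : stripSums)
    {
        run += s;
        best = max(best, run);  // record before resetting, so an
                                // all-negative strip still yields
                                // its largest element
        if (run < 0) run = 0;   // a negative prefix never helps
    }
    return best;
}

int main()
{
    // strip sums of three rows for one hypothetical column pair
    cout << kadane1D({9 + 2, -4 + 1, -1 + 8}) << "\n"; // prints 15
    return 0;
}

Running this pass for every (lft, rgt) pair is what the full solution below does.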
So the overall complexity is O(n^3) for an n x n matrix: O(n^2) column pairs, each scanned over the rows in O(n).
Source code:
#include <bits/stdc++.h>
#define S(a) scanf("%d",&a)
#define P(a) printf("%d\n",a)
#define fr(i,a,n) for(i=a;i<=n;i++)
using namespace std;

int i,j,n,lft,rgt,row,mx,sm,rsum[105][105],ar[105][105];

int main()
{
    while(S(n)==1)
    {
        ///rsum[i][j] = prefix sum of row i over columns 1..j
        fr(i,1,n)fr(j,1,n)
        {
            S(ar[i][j]);
            rsum[i][j]=ar[i][j]+rsum[i][j-1];
        }
        mx=INT_MIN;
        fr(lft,1,n)fr(rgt,lft,n)///moving from lft col to corner in each step!
        {
            sm=0;
            fr(row,1,n)///1D Kadane over the rows of this column strip
            {
                sm+=rsum[row][rgt]-rsum[row][lft-1];///row total in [lft,rgt], O(1)
                mx=max(mx,sm);///update before resetting, so an all-negative matrix is handled
                if(sm<0)sm=0;///drop a negative running sum
            }
        }
        P(mx);
    }
    return 0;
}
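For a quick sanity check, here is a well-known test instance (the 4x4 sample from UVa's "Maximum Sum" problem, which this write-up appears to solve); the best sub-rectangle is rows 2-4, columns 1-2, with sum 9+2-4+1-1+8 = 15:

Input:
4
0 -2 -7 0
9 2 -6 2
-4 1 -4 1
-1 8 0 -2

Output:
15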