Time Limit: 1000MS | Memory Limit: 65536KB | 64bit IO Format: %I64d & %I64u
Description
Given S, a set of integers, find the largest d such that a + b + c = d where a, b, c, and d are distinct elements of S.
Input
Several S, each consisting of a line containing an integer 1 <= n <= 1000 indicating the number of elements in S, followed by the elements of S, one per line. Each element of S is a distinct integer between -536870912 and +536870911
inclusive. The last line of input contains 0.
Output
For each S, a single line containing d, or a single line containing "no solution".
Sample Input
5
2
3
5
7
12
5
2
16
64
256
1024
0
Sample Output
12
no solution
Source
Waterloo local 2001.06.02
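Before the accepted solution, a tiny brute-force reference can make the "four distinct elements" requirement concrete. The sketch below is not from the original post (the variable names are illustrative assumptions); it simply tries every choice of indices and keeps the largest d, so it is only practical for very small n or for cross-checking a faster program.

/* Brute-force reference: enumerate d and every unordered triple (a, b, c)
   of other indices, keep the largest d with a + b + c = d. O(n^4). */
#include <cstdio>
#include <vector>
using namespace std;

int main()
{
    int n;
    while (scanf("%d", &n) == 1 && n)
    {
        vector<long long> s(n);
        for (int i = 0; i < n; i++) scanf("%lld", &s[i]);

        bool found = false;
        long long best = 0;
        for (int d = 0; d < n; d++)                 /* index of d */
            for (int i = 0; i < n; i++)             /* index of a */
                for (int j = i + 1; j < n; j++)     /* index of b */
                    for (int k = j + 1; k < n; k++) /* index of c */
                    {
                        if (d == i || d == j || d == k) continue;
                        if (s[i] + s[j] + s[k] == s[d] && (!found || s[d] > best))
                        {
                            best = s[d];
                            found = true;
                        }
                    }
        if (found) printf("%lld\n", best);
        else printf("no solution\n");
    }
    return 0;
}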
/* Find four distinct numbers in S with a + b + c = d.  Rearranged, a + b = d - c:
   sort the array, enumerate every pair (d, c) from the largest d down, and use a
   two-pointer scan over the smaller elements to look for a and b. */
#include<stdio.h>
#include<algorithm>
using namespace std;
#define MAX 100100
#define INF (-0x7fffffff)            /* sentinel outside the legal element range */
int a[MAX];
int main()
{
    int n;
    while(scanf("%d",&n),n)
    {
        for(int i=0;i<n;i++)
            scanf("%d",&a[i]);
        sort(a,a+n);
        int ans=INF;
        for(int i=n-1;i>=0;i--)            /* candidate d, largest value first */
        {
            for(int j=n-1;j>=0;j--)        /* candidate c */
            {
                if(i==j) continue;
                int sum=a[i]-a[j];         /* need a + b = d - c */
                int l=0,r=j-1;             /* WLOG c is the largest of a, b, c */
                while(l<r)
                {
                    if(l==i){l++;continue;}    /* a and b must also differ from d */
                    if(r==i){r--;continue;}
                    if(a[l]+a[r]==sum)
                    {
                        ans=a[i];
                        break;
                    }
                    if(a[l]+a[r]>sum) r--;
                    else l++;
                }
                if(ans!=INF) break;
            }
            if(ans!=INF) break;
        }
        if(ans!=INF) printf("%d\n",ans);
        else printf("no solution\n");
    }
    return 0;
}
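The two-pointer version above is worst-case O(n^3) and leans on early exit to finish in time. A common alternative for this problem is to precompute all pairwise sums and binary-search d - c among them. The sketch below shows that idea; it assumes a C++11 compiler, and the struct and variable names are illustrative, not from the original post.

/* Alternative sketch: store every pairwise sum a[x]+a[y] (x < y) with its
   indices, sort the sums, and for each (d, c) binary-search for d - c among
   pairs whose indices avoid those of d and c.  Roughly O(n^2 log n). */
#include <cstdio>
#include <algorithm>
#include <vector>
using namespace std;

struct PairSum { long long sum; int x, y; };

int main()
{
    int n;
    while (scanf("%d", &n) == 1 && n)
    {
        vector<long long> a(n);
        for (int i = 0; i < n; i++) scanf("%lld", &a[i]);
        sort(a.begin(), a.end());

        vector<PairSum> ps;                          /* all sums a[x]+a[y], x < y */
        for (int x = 0; x < n; x++)
            for (int y = x + 1; y < n; y++)
                ps.push_back({a[x] + a[y], x, y});
        sort(ps.begin(), ps.end(),
             [](const PairSum &p, const PairSum &q){ return p.sum < q.sum; });

        bool found = false;
        long long ans = 0;
        for (int i = n - 1; i >= 0 && !found; i--)       /* d, largest value first */
            for (int j = n - 1; j >= 0 && !found; j--)   /* c */
            {
                if (i == j) continue;
                long long need = a[i] - a[j];            /* want a + b = d - c */
                PairSum key = {need, 0, 0};
                auto it = lower_bound(ps.begin(), ps.end(), key,
                        [](const PairSum &p, const PairSum &q){ return p.sum < q.sum; });
                /* take the first stored pair with this sum that avoids i and j */
                for (; it != ps.end() && it->sum == need; ++it)
                    if (it->x != i && it->x != j && it->y != i && it->y != j)
                    {
                        ans = a[i];
                        found = true;
                        break;
                    }
            }
        if (found) printf("%lld\n", ans);
        else printf("no solution\n");
    }
    return 0;
}

Because the elements of S are distinct, for any fixed target sum at most one stored pair contains the index of d and at most one contains the index of c, so the scan after lower_bound inspects only a handful of candidates and the whole search stays near O(n^2 log n).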