• (LeetCode solution) Two Sum II


    Given an array of integers that is already sorted in ascending order, find two numbers such that they add up to a specific target number.

    The function twoSum should return indices of the two numbers such that they add up to the target, where index1 must be less than index2. Please note that your returned answers (both index1 and index2) are not zero-based.

    You may assume that each input would have exactly one solution and you may not use the same element twice.

    Input: numbers={2, 7, 11, 15}, target=9
    Output: index1=1, index2=2

The problem gives an array that is already sorted; we need to find the positions of the two numbers whose sum equals target.

Since for each number the task reduces to checking whether its complement exists, a hash table immediately comes to mind for fast lookup; once the complement is found, we only need to locate its position. The C++ implementation:

    vector<int> twoSum(vector<int>& numbers, int target) {
            unordered_map<int, int> hash_map;  // value -> occurrence count
            for (auto &n : numbers)
                hash_map[n]++;
            vector<int> res;
            int temp = 0;
            int j = 0;
            // Find the first index j whose complement exists in the array.
            for (auto &n : numbers) {
                int need = target - n;
                auto it = hash_map.find(need);
                // When need == n, it must occur at least twice, otherwise
                // we would be matching the element with itself.
                if (it != hash_map.end() && (need != n || it->second > 1)) {
                    res.push_back(j + 1);  // answers are 1-based
                    temp = need;
                    break;
                }
                j++;
            }
            // Locate the complement after position j for the second index.
            for (int i = j + 1; i < (int)numbers.size(); i++)
                if (temp == numbers[i]) {
                    res.push_back(i + 1);
                    break;
                }
            return res;
    }

Looking at solutions online, another approach I find quite elegant uses two pointers: one starts at the head of the array and one at the tail, and the search proceeds by moving them toward each other. Because the array is already sorted, this works perfectly. The C++ implementation, for reference:

    vector<int> twoSum(vector<int>& numbers, int target) {
            int l = 0, r = numbers.size() - 1;
            while (l < r) {
                int sum = numbers[l] + numbers[r];
                if (sum == target)          // found the unique pair (1-based answer)
                    return {l + 1, r + 1};
                else if (sum < target)      // sum too small: advance the left pointer
                    ++l;
                else                        // sum too large: retreat the right pointer
                    --r;
            }
            return {};
    }
• Original post: https://www.cnblogs.com/kiplove/p/6986738.html