Network Delay Time

A. Problem

There are N network nodes, labelled 1 to N.

Given times, a list of travel times as directed edges times[i] = (u, v, w), where u is the source node, v is the target node, and w is the time it takes for a signal to travel from source to target.

Now, we send a signal from a certain node K. How long will it take for all nodes to receive the signal? If it is impossible, return -1.

Note:
1. N will be in the range [1, 100].
2. K will be in the range [1, N].
3. The length of times will be in the range [1, 6000].
4. All edges times[i] = (u, v, w) will have 1 <= u, v <= N and 1 <= w <= 100.

B. Approach

This is clearly a single-source shortest-path problem. Here I use the Bellman-Ford algorithm: by repeatedly relaxing every edge and updating the dist values, after N-1 rounds each entry of dist holds the minimum travel time from K to that node.

C. Implementation

class Solution {
public:
    int networkDelayTime(vector<vector<int>>& times, int N, int K) {
        // Sentinel larger than any real path: a shortest path uses at most
        // N-1 = 99 edges of weight <= 100, so 100 * 100 is unreachable.
        int max_Distance = 100 * 100;
        vector<int> dist(N, max_Distance);  // dist[i]: best known time from K to node i+1
        dist[K - 1] = 0;

        // Bellman-Ford: N-1 rounds of relaxing every edge.
        for (int i = 1; i < N; i++)
        {
            for (auto& time : times)
            {
                int u = time[0] - 1, v = time[1] - 1, w = time[2];
                dist[v] = min(dist[v], dist[u] + w);
            }
        }
        // The answer is the time at which the slowest node hears the signal;
        // if any node is still at the sentinel, it is unreachable.
        int temp = *max_element(dist.begin(), dist.end());
        return temp == max_Distance ? -1 : temp;
    }
};