There are N network nodes, labelled 1 to N.
Given times, a list of travel times as directed edges times[i] = (u, v, w), where u is the source node, v is the target node, and w is the time it takes for a signal to travel from source to target.
Now, we send a signal from a certain node K. How long will it take for all nodes to receive the signal? If it is impossible, return -1.
This is a single-source shortest-path problem, a textbook fit for Dijkstra's algorithm.
One thing to be careful about: each time a node is added to the settled set, we must check whether any remaining distance values can be updated through it (edge relaxation). We can tell whether a node has been reached yet by checking whether its distance value is still INT_MAX.
int networkDelayTime(vector<vector<int>>& times, int N, int K) {
    vector<int> dist(N + 1, INT_MAX);   // INT_MAX marks "not reached yet"
    dist[K] = 0;
    vector<bool> inSet(N + 1, false);   // nodes already settled
    for (int i = 0; i < N; ++i) {
        // pick the unsettled node with the smallest tentative distance
        int u = -1;
        for (int v = 1; v <= N; ++v)
            if (!inSet[v] && (u == -1 || dist[v] < dist[u]))
                u = v;
        if (dist[u] == INT_MAX) break;  // remaining nodes are unreachable
        inSet[u] = true;
        // after adding u to the set, update distances that go through u
        for (auto& e : times)
            if (e[0] == u && dist[u] + e[2] < dist[e[1]])
                dist[e[1]] = dist[u] + e[2];
    }
    int ans = *max_element(dist.begin() + 1, dist.end());
    return ans == INT_MAX ? -1 : ans;
}
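The linear scan for the closest unsettled node costs O(N²) overall. For sparser graphs, the standard alternative is a min-heap (priority_queue) variant that runs in O(E log V). Below is a sketch of that version; the function name `networkDelayTimePQ` is mine, chosen to avoid clashing with the solution above:

```cpp
#include <algorithm>
#include <climits>
#include <queue>
#include <vector>
using namespace std;

// Min-heap Dijkstra sketch: O(E log V). Same contract as above:
// returns the time for all nodes to receive the signal, or -1 if
// some node is unreachable from K.
int networkDelayTimePQ(vector<vector<int>>& times, int N, int K) {
    // Build adjacency list: node -> (neighbor, weight).
    vector<vector<pair<int,int>>> adj(N + 1);
    for (auto& e : times)
        adj[e[0]].push_back({e[1], e[2]});

    vector<int> dist(N + 1, INT_MAX);
    dist[K] = 0;
    // Min-heap of (tentative distance, node).
    priority_queue<pair<int,int>, vector<pair<int,int>>, greater<>> pq;
    pq.push({0, K});
    while (!pq.empty()) {
        auto [d, u] = pq.top(); pq.pop();
        if (d > dist[u]) continue;          // stale entry, already settled
        for (auto [v, w] : adj[u])
            if (d + w < dist[v]) {          // relax edge u -> v
                dist[v] = d + w;
                pq.push({dist[v], v});
            }
    }
    int ans = *max_element(dist.begin() + 1, dist.end());
    return ans == INT_MAX ? -1 : ans;
}
```

For example, with times = [[2,1,1],[2,3,1],[3,4,1]], N = 4, K = 2, the signal reaches node 4 last via 2 → 3 → 4, so the answer is 2.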