The data engineers at Amazon are working on partitioning their server chains. There is a linear chain of n servers numbered from 1 to n, where the cost parameter of the ith server is given by cost[i]. These servers need to be partitioned into exactly k server chains. The cost of a server chain servers[i : j] is defined as cost[i] + cost[j], i.e., the costs of its first and last servers. The total cost is the sum of the costs of all k server chains.
Given n servers, an array cost, and an integer k, find the minimum and maximum possible total partitioning cost and return them as an array of size 2: [minimum cost, maximum cost].
Note: Partitioning an array means splitting it sequentially into two or more parts where each element belongs to exactly one partition. For the array [1, 2, 3, 4, 5], a valid partition would be [[1], [2, 3], [4, 5]], while [[1, 2], [2, 3], [4, 5]] and [[1, 3], [2, 4, 5]] are invalid (the first reuses the element 2; the second does not keep the elements contiguous).
Example:
Given cost = [1, 2, 3, 2, 5] and k = 3, the answer is [14, 18]. The minimum cost of 14 is achieved by the partition [[1], [2], [3, 2, 5]] with cost (1 + 1) + (2 + 2) + (3 + 5) = 14, and the maximum cost of 18 by [[1, 2], [3, 2], [5]] with cost (1 + 2) + (3 + 2) + (5 + 5) = 18.
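One way to approach this (a sketch, not part of the original problem statement): every chain contributes exactly its first and its last element, so any partition always pays cost[0] + cost[n-1], plus, for each of the k - 1 chosen split points between adjacent servers p and p + 1, the boundary sum cost[p] + cost[p + 1]. That reduces the problem to sorting the n - 1 boundary sums and greedily picking the k - 1 smallest for the minimum and the k - 1 largest for the maximum. Below is a minimal Python sketch of this idea; the function name partition_costs is my own choice.

```python
def partition_costs(cost, k):
    """Return [min_total, max_total] for splitting cost into k contiguous chains.

    Each chain contributes its first and last element, so every partition pays
    cost[0] + cost[-1] plus one boundary sum cost[p] + cost[p + 1] for each of
    the k - 1 split points chosen between adjacent positions p and p + 1.
    """
    base = cost[0] + cost[-1]
    # Boundary sums for every possible split point between adjacent servers.
    boundaries = sorted(cost[p] + cost[p + 1] for p in range(len(cost) - 1))
    # Smallest k - 1 boundary sums minimize the total; largest maximize it.
    min_total = base + sum(boundaries[:k - 1])
    max_total = base + sum(boundaries[len(boundaries) - (k - 1):])
    return [min_total, max_total]


if __name__ == "__main__":
    print(partition_costs([1, 2, 3, 2, 5], 3))  # [14, 18]
```

Note the explicit slice boundaries[len(boundaries) - (k - 1):] rather than boundaries[-(k - 1):], which would incorrectly select the whole list when k = 1. Sorting makes this O(n log n); a selection algorithm could bring it to O(n) if needed.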
Asked in:
Amazon