Systems and methods for generating wind power scenarios for wind-power-integrated stochastic unit commitment problems

Information

  • Patent Grant
  • Patent Number
    8,949,160
  • Date Filed
    Wednesday, March 7, 2012
  • Date Issued
    Tuesday, February 3, 2015
Abstract
The present disclosure relates generally to systematic algorithms (and associated systems and methods) that take a forecast model as input and produce a discrete probability distribution as output, using scenario reduction ideas from stochastic programming. In one example, an algorithm (and associated system and method) creates scenarios sequentially for each time period, leading to a scenario tree.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The present disclosure relates to commonly-owned, co-pending U.S. patent application Ser. No. 13/414,044, filed Mar. 7, 2012, entitled SYSTEMS AND METHODS FOR SOLVING LARGE SCALE STOCHASTIC UNIT COMMITMENT PROBLEMS, the entire contents and disclosure of which is incorporated by reference as if fully set forth herein.


BACKGROUND

The present disclosure relates generally to systematic algorithms (and associated systems and methods) that take a forecast model as input and produce a discrete probability distribution as output, using scenario reduction ideas from stochastic programming. In one example, an algorithm (and associated system and method) creates scenarios sequentially for each time period, leading to a scenario tree. These algorithms (and associated systems and methods) contrast with conventional methods that typically create scenarios for all time periods simultaneously and/or which do not form a scenario tree.


DESCRIPTION OF RELATED ART

Optimization under uncertainty has received the attention of many researchers in the last few decades (Sahinidis 2004). Many real-life problems under uncertainty can be modeled as a stochastic programming problem in the form:









$$\min_{x\in X}\ \mathbb{E}_{P}\bigl[f(\omega,x)\bigr]=\min_{x\in X}\ \int_{\Omega} f(\omega,x)\,P(d\omega),$$

where X ⊂ ℝⁿ is a given nonempty convex closed set, Ω a closed subset of ℝˢ and ℬ the Borel σ-field relative to Ω, the function ƒ from Ω×ℝⁿ to the extended reals is measurable with respect to ω and lower semi-continuous and convex with respect to x, and P is a fixed probability measure on (Ω, ℬ). This formulation models two-stage and multi-stage stochastic programming problems with recourse, where X is the feasible set of decisions and ƒ(ω, x) evaluates the optimized decisions under the scenario that ω is realized.


SUMMARY

As described herein, one embodiment provides a method that takes wind speed forecasts as input and uses Monte Carlo sampling and scenario reduction techniques to generate scenarios that form a scenario tree. Conventional methods typically do not use scenario reduction techniques (which may be crucial in decreasing the problem size) and/or do not produce a scenario tree as output. Scenario trees are important in stochastic programming because they lead to dynamic recourse decisions in the stochastic programming models.


In one embodiment, a system for providing scenario tree generation based at least in part upon scenario reduction is provided, the system comprising one or more processor units configured for: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency.


In another embodiment, a system for providing scenario tree generation based at least in part upon scenario reduction is provided, the system comprising one or more processor units configured for: receiving for each of a plurality of time periods a respective forecast, wherein the plurality of time periods comprise at least a first time period, at least a second time period and at least one intermediate time period, and wherein each forecast comprises a respective set of scenarios; for each time period in sequence, reducing the respective set of scenarios to a respective subset of scenarios, wherein each subset of scenarios comprises a subset of the respective set of scenarios; and generating a scenario tree based at least in part upon the subset of scenarios; wherein the scenario tree has interperiod independency.


In another embodiment, a method for providing scenario tree generation based at least in part upon scenario reduction is provided, the method comprising: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency.


In another embodiment, an article of manufacture is provided, comprising: at least one tangible computer usable medium having a computer readable program code logic tangibly embodied therein to execute at least one machine instruction in a processing unit for providing scenario tree generation based at least in part upon scenario reduction, said computer readable program code logic, when executing, performing the following steps: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency.





BRIEF DESCRIPTION OF THE DRAWINGS

Various objects, features and advantages of the present invention will become apparent to one skilled in the art, in view of the following detailed description taken in combination with the attached drawings, in which:



FIG. 1 depicts a graph comparing scenario tree generation methods according to embodiments of the present invention.



FIG. 2 depicts a block diagram of a system according to an embodiment of the present invention.



FIG. 3 depicts a flowchart according to an embodiment of the present invention.



FIG. 4 depicts a flowchart according to an embodiment of the present invention.





DETAILED DESCRIPTION

For the purposes of describing and claiming the present invention the term “scenario” is intended to refer to a postulated event, development or occurrence.


For the purposes of describing and claiming the present invention the term “scenario tree” is intended to refer to a tree-structured discrete probability distribution.


For the purposes of describing and claiming the present invention the term “forecast” is intended to refer to an estimation or calculation in advance.


For the purposes of describing and claiming the present invention the term “interperiod independency” is intended to refer to the following: if Ξ has interperiod independency property we have:







$$\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr)=\prod_{t=1}^{T}\Pr\bigl(\Xi_{t}=\xi_{i_t}\bigr)=\prod_{t=1}^{T}\rho_{i_t}.$$







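For illustration only, the following minimal Python sketch builds a small discrete distribution with this property from hypothetical per-period marginals (the numbers are illustrative, not part of the disclosure); each scenario probability is the product of the per-period probabilities ρ_{i_t}:

```python
from itertools import product

# Hypothetical per-period marginals: lists of (value, probability) pairs.
marginals = [
    [(8.0, 0.5), (12.0, 0.5)],   # period 1
    [(9.0, 0.3), (11.0, 0.7)],   # period 2
    [(7.0, 0.4), (13.0, 0.6)],   # period 3
]

# Under interperiod independency, Pr(Xi = (xi_1, ..., xi_T)) is the product
# of the per-period probabilities rho_{i_t}.
scenarios = {}
for combo in product(*marginals):
    path = tuple(value for value, _ in combo)
    prob = 1.0
    for _, rho in combo:
        prob *= rho
    scenarios[path] = prob

assert abs(sum(scenarios.values()) - 1.0) < 1e-12  # probabilities sum to one
```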
In one embodiment, a system for providing scenario tree generation based at least in part upon scenario reduction is provided, the system comprising one or more processor units configured for: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency.


In one example, the step of generating comprises using Algorithm 1 (shown below) with nt values in the input determined by the optimization problem in equation 8 (shown below). This Algorithm reduces a tree-structured discrete probability distribution (what we also call “scenario tree”) to another tree-structured probability distribution with fewer scenarios.


In another example, the steps of reducing comprise applying an input Ξ to an algorithm to produce an output Ξ̃.


In another example:

Ξ = {ξ_i}_{i=1}^S, ξ_i ∈ ℝ^T,

with marginal distributions Ξ_t = {ξ_{i_t}}_{i_t ∈ I_t}, t = 1, …, T, and n_t ∈ ℕ, t = 1, …, T.

In another example:

Ξ̃ = {ξ̃_j}_{j=1}^{S̃}, ξ̃_j ∈ ℝ^T,

with marginal distributions Ξ̃_t = {ξ̃_{j_t}}_{j_t ∈ J_t}, |J_t| = n_t, t = 1, …, T.


In another example, the algorithm comprises:













$$J^{*}_{t}\leftarrow\operatorname{arg\,min}^{n}_{(\tilde{\xi}_{1}^{t},\ldots,\tilde{\xi}_{n_t}^{t})}\ \sum_{i=1}^{S}\ \min_{k=1,\ldots,n_t}\ p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{k}^{t}\bigr|,\qquad t=1,\ldots,T$$

(i.e., J*_t is the set of n_t-medians of Ξ_t);

$$I_{t}(j_t)\leftarrow\Bigl\{\,i_t\in I_t\ \Bigm|\ j_t\in\operatorname{arg\,min}^{n}_{k_t\in J^{*}_{t}}\ \bigl|\xi_{i_t}-\tilde{\xi}_{k_t}\bigr|\Bigr\},\qquad j_t\in J^{*}_{t},\ t=1,\ldots,T;$$

j ← 0;

for all (j_1, …, j_T) ∈ J*_1 × ⋯ × J*_T do

$$j\leftarrow j+1,\qquad \tilde{\xi}_{j}=(\tilde{\xi}_{j_1},\ldots,\tilde{\xi}_{j_T}),\qquad \operatorname{support}(\tilde{\Xi})\leftarrow\operatorname{support}(\tilde{\Xi})\cup\{\tilde{\xi}_{j}\},$$

$$\Pr\bigl(\tilde{\Xi}=\tilde{\xi}_{j}\bigr)=q_{j}=\sum_{i_1\in I_1(j_1)}\cdots\sum_{i_T\in I_T(j_T)}\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr);$$

end for

S̃ ← j; and

$$D_K(\Xi,\tilde{\Xi})=\sum_{i=1}^{S}\ \min_{j=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr| =\sum_{i=1}^{S}\sum_{t=1}^{T}p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{j_t(i)}^{t}\bigr| =\sum_{t=1}^{T}M_{t}(n_t).$$

In another example, the scenario tree is applied to a stochastic unit commitment problem.


In another example, the stochastic unit commitment problem relates to wind power generation.


In another example, each of the first forecast and the second forecast comprises a forecast of wind speed.


In another example, the steps are carried out in the order recited.


In another embodiment, a system for providing scenario tree generation based at least in part upon scenario reduction is provided, the system comprising one or more processor units configured for: receiving for each of a plurality of time periods a respective forecast, wherein the plurality of time periods comprise at least a first time period, at least a second time period and at least one intermediate time period, and wherein each forecast comprises a respective set of scenarios; for each time period in sequence, reducing the respective set of scenarios to a respective subset of scenarios, wherein each subset of scenarios comprises a subset of the respective set of scenarios; and generating a scenario tree based at least in part upon the subset of scenarios; wherein the scenario tree has interperiod independency.


In one example, the step of generating comprises using Algorithm 1 (shown below) with nt values in the input determined by the optimization problem in equation 8 (shown below). This Algorithm reduces a tree-structured discrete probability distribution (what we also call “scenario tree”) to another tree-structured probability distribution with fewer scenarios.


In another example, the steps of reducing comprise applying an input Ξ to an algorithm to produce an output Ξ̃.


In another example:

Ξ = {ξ_i}_{i=1}^S, ξ_i ∈ ℝ^T,

with marginal distributions Ξ_t = {ξ_{i_t}}_{i_t ∈ I_t}, t = 1, …, T, and n_t ∈ ℕ, t = 1, …, T.

In another example:

Ξ̃ = {ξ̃_j}_{j=1}^{S̃}, ξ̃_j ∈ ℝ^T,

with marginal distributions Ξ̃_t = {ξ̃_{j_t}}_{j_t ∈ J_t}, |J_t| = n_t, t = 1, …, T.


In another example, the algorithm comprises:













$$J^{*}_{t}\leftarrow\operatorname{arg\,min}^{n}_{(\tilde{\xi}_{1}^{t},\ldots,\tilde{\xi}_{n_t}^{t})}\ \sum_{i=1}^{S}\ \min_{k=1,\ldots,n_t}\ p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{k}^{t}\bigr|,\qquad t=1,\ldots,T$$

(i.e., J*_t is the set of n_t-medians of Ξ_t);

$$I_{t}(j_t)\leftarrow\Bigl\{\,i_t\in I_t\ \Bigm|\ j_t\in\operatorname{arg\,min}^{n}_{k_t\in J^{*}_{t}}\ \bigl|\xi_{i_t}-\tilde{\xi}_{k_t}\bigr|\Bigr\},\qquad j_t\in J^{*}_{t},\ t=1,\ldots,T;$$

j ← 0;

for all (j_1, …, j_T) ∈ J*_1 × ⋯ × J*_T do

$$j\leftarrow j+1,\qquad \tilde{\xi}_{j}=(\tilde{\xi}_{j_1},\ldots,\tilde{\xi}_{j_T}),\qquad \operatorname{support}(\tilde{\Xi})\leftarrow\operatorname{support}(\tilde{\Xi})\cup\{\tilde{\xi}_{j}\},$$

$$\Pr\bigl(\tilde{\Xi}=\tilde{\xi}_{j}\bigr)=q_{j}=\sum_{i_1\in I_1(j_1)}\cdots\sum_{i_T\in I_T(j_T)}\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr);$$

end for

S̃ ← j; and

$$D_K(\Xi,\tilde{\Xi})=\sum_{i=1}^{S}\ \min_{j=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr| =\sum_{i=1}^{S}\sum_{t=1}^{T}p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{j_t(i)}^{t}\bigr| =\sum_{t=1}^{T}M_{t}(n_t).$$

In another example, the scenario tree is applied to a stochastic unit commitment problem.


In another example, the stochastic unit commitment problem relates to wind power generation.


In another example, each of the forecasts comprises a forecast of wind speed.


In another example, the steps are carried out in the order recited.


In another embodiment, a method for providing scenario tree generation based at least in part upon scenario reduction is provided, the method comprising: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency.


In one example, the step of generating comprises using Algorithm 1 (shown below) with nt values in the input determined by the optimization problem in equation 8 (shown below). This Algorithm reduces a tree-structured discrete probability distribution (what we also call “scenario tree”) to another tree-structured probability distribution with fewer scenarios.


In another example, the steps of reducing comprise applying an input Ξ to an algorithm to produce an output Ξ̃.


In another example:

Ξ = {ξ_i}_{i=1}^S, ξ_i ∈ ℝ^T,

with marginal distributions Ξ_t = {ξ_{i_t}}_{i_t ∈ I_t}, t = 1, …, T, and n_t ∈ ℕ, t = 1, …, T.

In another example:

Ξ̃ = {ξ̃_j}_{j=1}^{S̃}, ξ̃_j ∈ ℝ^T,

with marginal distributions Ξ̃_t = {ξ̃_{j_t}}_{j_t ∈ J_t}, |J_t| = n_t, t = 1, …, T, and


wherein the algorithm comprises:













$$J^{*}_{t}\leftarrow\operatorname{arg\,min}^{n}_{(\tilde{\xi}_{1}^{t},\ldots,\tilde{\xi}_{n_t}^{t})}\ \sum_{i=1}^{S}\ \min_{k=1,\ldots,n_t}\ p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{k}^{t}\bigr|,\qquad t=1,\ldots,T$$

(i.e., J*_t is the set of n_t-medians of Ξ_t);

$$I_{t}(j_t)\leftarrow\Bigl\{\,i_t\in I_t\ \Bigm|\ j_t\in\operatorname{arg\,min}^{n}_{k_t\in J^{*}_{t}}\ \bigl|\xi_{i_t}-\tilde{\xi}_{k_t}\bigr|\Bigr\},\qquad j_t\in J^{*}_{t},\ t=1,\ldots,T;$$

j ← 0;

for all (j_1, …, j_T) ∈ J*_1 × ⋯ × J*_T do

$$j\leftarrow j+1,\qquad \tilde{\xi}_{j}=(\tilde{\xi}_{j_1},\ldots,\tilde{\xi}_{j_T}),\qquad \operatorname{support}(\tilde{\Xi})\leftarrow\operatorname{support}(\tilde{\Xi})\cup\{\tilde{\xi}_{j}\},$$

$$\Pr\bigl(\tilde{\Xi}=\tilde{\xi}_{j}\bigr)=q_{j}=\sum_{i_1\in I_1(j_1)}\cdots\sum_{i_T\in I_T(j_T)}\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr);$$

end for

S̃ ← j; and

$$D_K(\Xi,\tilde{\Xi})=\sum_{i=1}^{S}\ \min_{j=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr| =\sum_{i=1}^{S}\sum_{t=1}^{T}p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{j_t(i)}^{t}\bigr| =\sum_{t=1}^{T}M_{t}(n_t).$$

In another example, the steps are carried out in the order recited.


In another embodiment, an article of manufacture is provided, comprising: at least one tangible computer usable medium having a computer readable program code logic tangibly embodied therein to execute at least one machine instruction in a processing unit for providing scenario tree generation based at least in part upon scenario reduction, said computer readable program code logic, when executing, performing the following steps: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency.


In one example, the step of generating comprises using Algorithm 1 (shown below) with nt values in the input determined by the optimization problem in equation 8 (shown below). This Algorithm reduces a tree-structured discrete probability distribution (what we also call “scenario tree”) to another tree-structured probability distribution with fewer scenarios.


In another example, the steps of reducing comprise applying an input Ξ to an algorithm to produce an output Ξ̃.


In another example:

Ξ = {ξ_i}_{i=1}^S, ξ_i ∈ ℝ^T,

with marginal distributions Ξ_t = {ξ_{i_t}}_{i_t ∈ I_t}, t = 1, …, T, and n_t ∈ ℕ, t = 1, …, T,

Ξ̃ = {ξ̃_j}_{j=1}^{S̃}, ξ̃_j ∈ ℝ^T,

with marginal distributions Ξ̃_t = {ξ̃_{j_t}}_{j_t ∈ J_t}, |J_t| = n_t, t = 1, …, T, and


wherein the algorithm comprises:













$$J^{*}_{t}\leftarrow\operatorname{arg\,min}^{n}_{(\tilde{\xi}_{1}^{t},\ldots,\tilde{\xi}_{n_t}^{t})}\ \sum_{i=1}^{S}\ \min_{k=1,\ldots,n_t}\ p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{k}^{t}\bigr|,\qquad t=1,\ldots,T$$

(i.e., J*_t is the set of n_t-medians of Ξ_t);

$$I_{t}(j_t)\leftarrow\Bigl\{\,i_t\in I_t\ \Bigm|\ j_t\in\operatorname{arg\,min}^{n}_{k_t\in J^{*}_{t}}\ \bigl|\xi_{i_t}-\tilde{\xi}_{k_t}\bigr|\Bigr\},\qquad j_t\in J^{*}_{t},\ t=1,\ldots,T;$$

j ← 0;

for all (j_1, …, j_T) ∈ J*_1 × ⋯ × J*_T do

$$j\leftarrow j+1,\qquad \tilde{\xi}_{j}=(\tilde{\xi}_{j_1},\ldots,\tilde{\xi}_{j_T}),\qquad \operatorname{support}(\tilde{\Xi})\leftarrow\operatorname{support}(\tilde{\Xi})\cup\{\tilde{\xi}_{j}\},$$

$$\Pr\bigl(\tilde{\Xi}=\tilde{\xi}_{j}\bigr)=q_{j}=\sum_{i_1\in I_1(j_1)}\cdots\sum_{i_T\in I_T(j_T)}\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr);$$

end for

S̃ ← j; and

$$D_K(\Xi,\tilde{\Xi})=\sum_{i=1}^{S}\ \min_{j=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr| =\sum_{i=1}^{S}\sum_{t=1}^{T}p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{j_t(i)}^{t}\bigr| =\sum_{t=1}^{T}M_{t}(n_t).$$

In another example, the steps are carried out in the order recited.


In one embodiment, a method comprises the following: (a) first, forecasts from a forecast model are produced; (b) next, a scenario reduction technique is used to get a smaller yet representative set of scenarios. In one example, this two-step procedure is repeated for all time periods sequentially starting from the first time period going until the last time period. At a time period, the number of scenarios in the reduced set determines the number of children of that node in the scenario tree, and is given as further input to the method.


In another embodiment, if the forecast model has a closed form with necessary conditions, the two-step procedure is replaced by a single operation that analytically finds a set of scenarios closest to the forecast model.


As mentioned above, optimization under uncertainty has received the attention of many researchers in the last few decades (Sahinidis 2004). Many real-life problems under uncertainty can be modeled as a stochastic programming problem in the form:












$$\min_{x\in X}\ \mathbb{E}_{P}\bigl[f(\omega,x)\bigr]=\min_{x\in X}\ \int_{\Omega} f(\omega,x)\,P(d\omega),\qquad(1)$$

where X ⊂ ℝⁿ is a given nonempty convex closed set, Ω a closed subset of ℝˢ and ℬ the Borel σ-field relative to Ω, the function ƒ from Ω×ℝⁿ to the extended reals is measurable with respect to ω and lower semi-continuous and convex with respect to x, and P is a fixed probability measure on (Ω, ℬ). This formulation models two-stage and multi-stage stochastic programming problems with recourse, where X is the feasible set of decisions and ƒ(ω, x) evaluates the optimized decisions under the scenario that ω is realized.


Many issues concerning the formulation of model (1) and solution methodologies designed to solve it have to be addressed. Model (1) is usually a large-scale problem with multi-stage decisions, P is not known precisely, and ƒ is given implicitly as an iterated optimal value. One approach to decrease the size of model (1) and make it computationally tractable is to approximate the probability measure P by a discrete probability measure with a manageable number of probability atoms. Dupačová et al. (2003) and Heitsch and Römisch (2003) consider a version of model (1) with a discrete probability measure P. The authors describe a methodology to reduce P to another discrete probability distribution with a finite support such that the Kantorovich distance between the two distributions is minimized. The authors show that model (1) remains stable to perturbations to its underlying probability distribution as long as the Kantorovich distance between the original distribution and the perturbed distribution is kept under control. Here the stability of a stochastic program means that its optimal value and the set of its ε-optimal solutions change continuously with perturbations to its underlying probability distribution.


Noting that a scenario tree is a discrete probability distribution, Gröwe-Kuska and Römisch (2005, Chapter 30) propose a heuristic algorithm that generates a scenario tree from a set of sample paths (i.e., possible realizations of a discrete probability distribution) using these scenario reduction techniques such that the Kantorovich distance between the set of sample paths and the set of scenarios in the scenario tree is minimized. The authors consider a general discrete probability distribution (scenario tree) and propose only a heuristic algorithm to reduce it. In this disclosure we assume more structure on the underlying probability distribution and provide (as one embodiment) an exact algorithm that reduces a discrete distribution to its closest distribution. We also prove a property of the reduced distribution that helps in reducing the computational effort of scenario reduction and in generating scenario trees from continuous distributions. We apply various embodiments to instances of the stochastic unit commitment problem.


We first develop a lower bound on the minimum Kantorovich distance that can be achieved by reducing a discrete probability distribution to an arbitrary discrete probability distribution with a cardinality restriction on the number of distinct realizations at each time period. We then focus on discrete probability distributions that have interperiod independency property, and develop an exact reduction algorithm that finds the reduced distribution closest to it. We show that the reduction can be performed at each time period of the original distribution independent of the other time periods. This decoupling property enables us to reduce the combinatorial-sized scenario tree with a much smaller computational effort. This also enables us to reduce continuous probability measures with interperiod independency property to much smaller discrete probability measures by successively applying Monte Carlo sampling and scenario reduction at each time period.


Reference will now be made to Scenario Tree Generation by Scenario Reduction. Let Ξ = {ξ_i}_{i=1}^S be a T-dimensional discrete probability distribution with finite support, where ξ_i ∈ ℝ^T, Pr(Ξ = ξ_i) = p_i, p_i ≥ 0, i = 1, …, S, Σ_{i=1}^S p_i = 1. Let Ξ̃ = {ξ̃_j}_{j=1}^{S̃} be the reduced distribution with Pr(Ξ̃ = ξ̃_j) = q_j, q_j ≥ 0, j = 1, …, S̃, Σ_{j=1}^{S̃} q_j = 1. Let Ξ_t (Ξ̃_t) denote the marginal distribution of Ξ (Ξ̃) at the tth time period, i.e.,

$$\Pr\bigl(\Xi_{t}=\xi_{i_t}\bigr)=\sum_{i=1}^{S}\Pr\bigl(\Xi=\xi_{i},\ \xi_{i}^{t}=\xi_{i_t}\bigr)=\rho_{i_t},\qquad i_t\in I_t,\ t=1,\ldots,T,$$

$$\Pr\bigl(\tilde{\Xi}_{t}=\tilde{\xi}_{j_t}\bigr)=\sum_{j=1}^{\tilde{S}}\Pr\bigl(\tilde{\Xi}=\tilde{\xi}_{j},\ \tilde{\xi}_{j}^{t}=\tilde{\xi}_{j_t}\bigr)=\pi_{j_t},\qquad j_t\in J_t,\ t=1,\ldots,T,$$

where I_t (J_t) is the support of Ξ_t (Ξ̃_t) with cardinality m_t (n_t). We aim to find a lower bound on the minimum Kantorovich distance between Ξ and an arbitrary distribution Ξ̃ whose number of distinct realizations at each time period is at most n_t, t = 1, …, T.

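As an illustration of these marginals, the short Python sketch below (the data structures and names are illustrative, not part of the disclosure) recovers each Ξ_t from a joint discrete distribution Ξ stored as a mapping from T-tuples to probabilities:

```python
from collections import defaultdict

def marginals(joint):
    """joint: dict mapping a scenario (a T-tuple of realizations) to its probability p_i."""
    T = len(next(iter(joint)))
    rho = [defaultdict(float) for _ in range(T)]
    for path, p in joint.items():
        for t, value in enumerate(path):
            rho[t][value] += p  # Pr(Xi_t = value) accumulates the joint mass
    return [dict(r) for r in rho]

# Example: a two-period distribution with three scenarios.
joint = {(8.0, 9.0): 0.2, (8.0, 11.0): 0.3, (12.0, 11.0): 0.5}
print(marginals(joint))  # [{8.0: 0.5, 12.0: 0.5}, {9.0: 0.2, 11.0: 0.8}]
```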

Reference will now be made to the Lower Bound on the Minimum Kantorovich Distance. The Kantorovich distance between Ξ and {tilde over (Ξ)} reduces to the optimal value of a linear transportation problem:












$$D_K(\Xi,\tilde{\Xi})=\inf\Bigl\{\sum_{i=1}^{S}\sum_{j=1}^{\tilde{S}}\eta_{ij}\,c(\xi_{i},\tilde{\xi}_{j})\ :\ \eta_{ij}\ge 0,\ \sum_{i=1}^{S}\eta_{ij}=q_{j},\ \sum_{j=1}^{\tilde{S}}\eta_{ij}=p_{i},\ \forall i,\ \forall j\Bigr\},\qquad(2)$$








where c(ξ_i, ξ̃_j) = Σ_{t=1}^{T} |ξ_i^t − ξ̃_j^t|. To obtain a distribution Ξ̃ that is closest to Ξ in terms of the Kantorovich distance, we use arbitrary probability weights q_j for Ξ̃. The infimization in (2) becomes











$$D_K(\Xi,\tilde{\Xi})=\sum_{i=1}^{S}p_{i}\ \min_{j=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr|\qquad(3)$$




with new probability weights

$$q_{j}^{*}=\sum_{i\in I(j)}p_{i},\qquad\text{where } I(j)=\Bigl\{\,i=1,\ldots,S\ \Bigm|\ j\in\operatorname{arg\,min}^{n}_{j'=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j'}^{t}\bigr|\Bigr\}.$$
The set argminⁿ is a singleton subset of the set argmin, obtained, without loss of generality, by deleting all elements of the set argmin but the smallest one in lexicographic order. To get a lower bound on the Kantorovich distance we change the order of summation and minimization in (3):














$$
\begin{aligned}
D_K(\Xi,\tilde{\Xi})&=\sum_{i=1}^{S}p_{i}\ \min_{j=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr|\\
&\ge \sum_{i=1}^{S}p_{i}\sum_{t=1}^{T}\ \min_{j=1,\ldots,\tilde{S}}\ \bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr|\\
&=\sum_{t=1}^{T}\sum_{i=1}^{S}\ \min_{j=1,\ldots,\tilde{S}}\ p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr|.\qquad(4)
\end{aligned}
$$



To obtain a lower bound on the minimum Kantorovich distance, we use arbitrary realizations ξ̃_j for the reduced distribution Ξ̃, with the restriction that the cardinality of the support of Ξ̃_t is n_t, t = 1, …, T:













$$
\begin{aligned}
\min_{\tilde{\Xi}=\{\tilde{\xi}_{j}\}_{j=1}^{\tilde{S}}} D_K(\Xi,\tilde{\Xi})
&\ge \min_{\tilde{\Xi}=\{\tilde{\xi}_{j}\}_{j=1}^{\tilde{S}}}\ \sum_{t=1}^{T}\sum_{i=1}^{S}\ \min_{j=1,\ldots,\tilde{S}}\ p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr|\\
&=\min_{\tilde{\xi}_{1}^{1},\ldots,\tilde{\xi}_{1}^{T},\ \ldots,\ \tilde{\xi}_{\tilde{S}}^{1},\ldots,\tilde{\xi}_{\tilde{S}}^{T}}\ \sum_{t=1}^{T}\sum_{i=1}^{S}\ \min_{j=1,\ldots,\tilde{S}}\ p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr|\\
&=\sum_{t=1}^{T}\ \min_{\tilde{\xi}_{1}^{t},\ldots,\tilde{\xi}_{n_t}^{t}}\ \sum_{i=1}^{S}\ \min_{j=1,\ldots,n_t}\ p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr|
=\sum_{t=1}^{T}M_{t}(n_t),\qquad(5)
\end{aligned}
$$




where M_t(1) is the median of the probability distribution Ξ_t and, in general, M_t(k) is the k-median of Ξ_t. We note that if we allow Ξ̃ to have as many realizations as Ξ, then the reduced distribution closest to Ξ trivially becomes Ξ itself. In this case, n_t becomes equal to m_t for t = 1, …, T, and thus M_t(n_t) becomes equal to zero. Thus, n_t is a control parameter that trades off the fidelity of the reduced distribution against the number of realizations it retains.

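Because each M_t(n_t) is a one-dimensional weighted k-median value, it can be computed exactly. The sketch below is an illustrative implementation (not taken from the disclosure) that uses the standard dynamic program over the sorted support, exploiting the fact that optimal one-dimensional k-median clusters are contiguous blocks:

```python
def Mt(values, probs, k):
    """Exact M_t(k): minimum over k centers of sum_i min_j p_i * |xi_i^t - c_j|."""
    pts = sorted(zip(values, probs))
    x = [v for v, _ in pts]
    w = [p for _, p in pts]
    S = len(x)

    # cost[a][b]: weighted absolute deviation of block x[a..b] from its weighted median
    cost = [[0.0] * S for _ in range(S)]
    for a in range(S):
        for b in range(a, S):
            half = sum(w[a:b + 1]) / 2.0
            acc, med = 0.0, x[b]
            for xi, wi in zip(x[a:b + 1], w[a:b + 1]):
                acc += wi
                if acc >= half:          # weighted median of the block
                    med = xi
                    break
            cost[a][b] = sum(wi * abs(xi - med) for xi, wi in zip(x[a:b + 1], w[a:b + 1]))

    INF = float("inf")
    # dp[m][b]: best value covering the prefix x[0..b] with m contiguous clusters
    dp = [[INF] * S for _ in range(k + 1)]
    dp[1] = cost[0][:]
    for m in range(2, k + 1):
        for b in range(S):
            for a in range(1, b + 1):
                dp[m][b] = min(dp[m][b], dp[m - 1][a - 1] + cost[a][b])
    return dp[min(k, S)][S - 1]          # M_t(k); zero once k reaches the support size
```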

Reference will now be made to Discrete Distributions with Interperiod Independency Property. In this section we focus on discrete distributions with interperiod independency property, and give an algorithm that reduces these distributions to another such distribution. If Ξ has interperiod independency property we have







$$\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr)=\prod_{t=1}^{T}\Pr\bigl(\Xi_{t}=\xi_{i_t}\bigr)=\prod_{t=1}^{T}\rho_{i_t}.$$


Algorithm 1: Construct Ξ̃ from Ξ

Input: Ξ = {ξ_i}_{i=1}^S, ξ_i ∈ ℝ^T, with marginal distributions Ξ_t = {ξ_{i_t}}_{i_t ∈ I_t}, t = 1, …, T, and n_t ∈ ℕ, t = 1, …, T.

Output: Ξ̃ = {ξ̃_j}_{j=1}^{S̃}, ξ̃_j ∈ ℝ^T, with marginal distributions Ξ̃_t = {ξ̃_{j_t}}_{j_t ∈ J_t}, |J_t| = n_t, t = 1, …, T.

$$J^{*}_{t}\leftarrow\operatorname{arg\,min}^{n}_{(\tilde{\xi}_{1}^{t},\ldots,\tilde{\xi}_{n_t}^{t})}\ \sum_{i=1}^{S}\ \min_{k=1,\ldots,n_t}\ p_i\bigl|\xi_{i}^{t}-\tilde{\xi}_{k}^{t}\bigr|,\qquad t=1,\ldots,T$$

(i.e., J*_t is the set of n_t-medians of Ξ_t.)

$$I_{t}(j_t)\leftarrow\Bigl\{\,i_t\in I_t\ \Bigm|\ j_t\in\operatorname{arg\,min}^{n}_{k_t\in J^{*}_{t}}\ \bigl|\xi_{i_t}-\tilde{\xi}_{k_t}\bigr|\Bigr\},\qquad j_t\in J^{*}_{t},\ t=1,\ldots,T;$$

j ← 0;

for all (j_1, …, j_T) ∈ J*_1 × ⋯ × J*_T do

$$j\leftarrow j+1,\qquad \tilde{\xi}_{j}=(\tilde{\xi}_{j_1},\ldots,\tilde{\xi}_{j_T}),\qquad \operatorname{support}(\tilde{\Xi})\leftarrow\operatorname{support}(\tilde{\Xi})\cup\{\tilde{\xi}_{j}\},$$

$$\Pr\bigl(\tilde{\Xi}=\tilde{\xi}_{j}\bigr)=q_{j}=\sum_{i_1\in I_1(j_1)}\cdots\sum_{i_T\in I_T(j_T)}\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr);$$

end for

S̃ ← j.

We use Algorithm 1 (above) to reduce Ξ to Ξ̃. We show that the distance between Ξ and Ξ̃ is equal to the lower bound provided by (5). Given i = 1, …, S and t = 1, …, T, we let j_t(i) ∈ J*_t denote the smallest index such that |ξ_i^t − ξ̃_{j_t(i)}^t| ≤ |ξ_i^t − ξ̃_{j_t}^t| for all j_t ∈ J*_t. We let j(i) ∈ {1, …, S̃} be such that ξ̃_{j(i)} = (ξ̃_{j_1(i)}, …, ξ̃_{j_T(i)}); by construction of Ξ̃, such a j(i) exists. Then,

$$\min_{j=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr| =\sum_{t=1}^{T}p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j(i)}^{t}\bigr| =\sum_{t=1}^{T}p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j_t(i)}^{t}\bigr|.$$

Thus,

$$D_K(\Xi,\tilde{\Xi})=\sum_{i=1}^{S}\ \min_{j=1,\ldots,\tilde{S}}\ \sum_{t=1}^{T}p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j}^{t}\bigr| =\sum_{i=1}^{S}\sum_{t=1}^{T}p_{i}\bigl|\xi_{i}^{t}-\tilde{\xi}_{j_t(i)}^{t}\bigr| =\sum_{t=1}^{T}M_{t}(n_t).$$
If, in addition, Ξ has interperiod independency property, we have














$$\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr)=\prod_{t=1}^{T}\Pr\bigl(\Xi_{t}=\xi_{i_t}\bigr)=\prod_{t=1}^{T}\rho_{i_t}.$$

Thus, for ξ̃_j = (ξ̃_{j_1}, …, ξ̃_{j_T}),

$$
\begin{aligned}
\Pr\bigl(\tilde{\Xi}=\tilde{\xi}_{j}\bigr)
&=\sum_{i_1\in I_1(j_1)}\cdots\sum_{i_T\in I_T(j_T)}\Pr\bigl(\Xi=(\xi_{i_1},\ldots,\xi_{i_T})\bigr)\\
&=\sum_{i_1\in I_1(j_1)}\cdots\sum_{i_T\in I_T(j_T)}\ \prod_{t=1}^{T}\rho_{i_t}
=\prod_{t=1}^{T}\Bigl(\sum_{i_t\in I_t(j_t)}\rho_{i_t}\Bigr)\\
&=\prod_{t=1}^{T}\Pr\bigl(\tilde{\Xi}_{t}=\tilde{\xi}_{j_t}\bigr)
=\prod_{t=1}^{T}\pi_{j_t}.\qquad(6)
\end{aligned}
$$



Hence, the reduced distribution has interperiod independency property as well.

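For illustration, a minimal Python sketch of Algorithm 1 under the interperiod independency assumption follows. The per-period medians J*_t are assumed to be supplied (for example by an exact one-dimensional k-median routine such as the M_t sketch above); all names are illustrative rather than part of the disclosure, and the product form of the probabilities relies on the independency property just shown:

```python
from itertools import product

def reduce_tree(marginals, medians):
    """
    marginals: list over t of lists of (value, rho) pairs for Xi_t
    medians:   list over t of the chosen n_t-medians J*_t (lists of values)
    Returns the reduced distribution as {(xi~_{j_1}, ..., xi~_{j_T}): q_j}.
    """
    reduced_marginals = []
    for support, centers in zip(marginals, medians):
        # I_t(j_t): assign every original realization to its nearest median
        # (ties broken toward the first listed center, mirroring the lexicographic rule)
        pi = {c: 0.0 for c in centers}
        for value, rho in support:
            nearest = min(centers, key=lambda c: abs(value - c))
            pi[nearest] += rho
        reduced_marginals.append(list(pi.items()))

    # Cross product of the reduced marginals: the reduced scenario tree, whose
    # scenario probabilities factor across periods (interperiod independency).
    tree = {}
    for combo in product(*reduced_marginals):
        path = tuple(value for value, _ in combo)
        q = 1.0
        for _, p in combo:
            q *= p
        tree[path] = q
    return tree
```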

Reference will now be made to a Scenario Tree with Minimum Distance. Given a discrete probability distribution with interperiod independency property, Algorithm 1 allows us to generate a scenario tree closest to it such that the reduced tree has interperiod independency property and has n_t realizations at each time period. But typically the size of the scenario tree is given by the total number of scenarios n (= ∏_{t=1}^{T} n_t), not by the number of realizations at each time period. We give an optimization problem that finds a scenario tree closest to the original distribution with interperiod independency property and with at most n scenarios. Let the binary decision variable z_η^t take value 1 if the scenario tree has η realizations at time period t, and 0 otherwise. Consider the following minimization problem:











$$
\begin{aligned}
\min_{z}\quad & \sum_{t=1}^{T}\sum_{\eta=1}^{n} M_{t}(\eta)\,z_{\eta}^{t}\\
\text{s.t.}\quad & \prod_{t=1}^{T}\Bigl(\sum_{\eta=1}^{n}\eta\,z_{\eta}^{t}\Bigr)\le n,\\
& \sum_{\eta=1}^{n} z_{\eta}^{t}=1,\qquad t=1,\ldots,T,\\
& z_{\eta}^{t}\in\{0,1\},\qquad t=1,\ldots,T,\ \eta=1,\ldots,n.\qquad(7)
\end{aligned}
$$




This optimization problem gives the parameters (n_t) of the closest scenario tree with interperiod independency property and with at most n scenarios. We can linearize the coupling constraint by noting that the log function is increasing, and by noting the binary nature of the z variables and the constraint Σ_{η=1}^{n} z_η^t = 1, t = 1, …, T. That is,











$$\prod_{t=1}^{T}\Bigl(\sum_{\eta=1}^{n}\eta\,z_{\eta}^{t}\Bigr)\le n \;\Longleftrightarrow\; \sum_{t=1}^{T}\log\Bigl(\sum_{\eta=1}^{n}\eta\,z_{\eta}^{t}\Bigr)\le\log(n) \;\Longleftrightarrow\; \sum_{t=1}^{T}\sum_{\eta=1}^{n}\log(\eta)\,z_{\eta}^{t}\le\log(n).$$


And thus the linearized problem reads:











$$
\begin{aligned}
\min_{z}\quad & \sum_{t=1}^{T}\sum_{\eta=1}^{n} M_{t}(\eta)\,z_{\eta}^{t}\\
\text{s.t.}\quad & \sum_{t=1}^{T}\sum_{\eta=1}^{n}\log(\eta)\,z_{\eta}^{t}\le\log(n),\\
& \sum_{\eta=1}^{n} z_{\eta}^{t}=1,\qquad t=1,\ldots,T,\\
& z_{\eta}^{t}\in\{0,1\},\qquad t=1,\ldots,T,\ \eta=1,\ldots,n.\qquad(8)
\end{aligned}
$$



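For small horizons the branching parameters can also be found by direct enumeration. The sketch below is illustrative (exponential in T, and not the linearized integer program (8) itself); it searches over all (n_1, …, n_T) whose product does not exceed n and minimizes Σ_t M_t(n_t):

```python
import itertools
import math

def choose_branching(M, n):
    """
    M: list over t of lists where M[t][eta - 1] = M_t(eta) for eta = 1..n
       (for example computed with the one-dimensional k-median sketch above)
    n: maximum total number of scenarios in the reduced tree
    Returns the best (n_1, ..., n_T) and the corresponding bound sum_t M_t(n_t).
    """
    T = len(M)
    best_combo, best_val = None, math.inf
    for combo in itertools.product(range(1, n + 1), repeat=T):
        if math.prod(combo) > n:          # the coupling constraint of problem (7)
            continue
        val = sum(M[t][combo[t] - 1] for t in range(T))
        if val < best_val:
            best_combo, best_val = combo, val
    return best_combo, best_val
```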
Reference will now be made to an Application to a Stochastic Unit Commitment Problem. This example unit commitment problem relates to finding an optimal up and down schedule and corresponding generation amounts for a set of generators over a twenty-four-hour horizon so that the total cost of generation and transmission is minimized and a set of constraints (such as demand requirements, upper and lower generation limits, minimum up/down time limits, ramp-up/down constraints, and transmission constraints) is observed. Unit commitment (UC) lies at the core of planning and operational decisions faced by independent system operators (ISOs), regional transmission organizations (RTOs), and utility companies. Hence it has received a good deal of attention in the industry.


The academic literature on the unit commitment problem dates back to the 1960s. An integer programming approach proposed by Dillon et al. (1978) was one of the earliest optimization-based approaches to the unit commitment problem. The authors address the unit commitment problem of hydro-thermal systems with reserve requirements. It was one of the earliest papers to solve real-life problems with 20 units. The authors developed two sets of globally valid inequalities and used these inequalities in a branch-and-bound algorithm. The dynamic programming approach developed by Snyder et al. [W. L. Snyder Jr., H. D. Powell, Jr., and J. C. Rayburn, “Dynamic programming approach to unit commitment”, IEEE Transactions on Power Systems, 2 (1987), pp. 339-347] was one of the earliest successful dynamic programming algorithms. The algorithm featured a classification of units so as to reduce the number of states. The authors addressed the problem at the San Diego Gas & Electric system with 30 generators. Expert systems [Z. Ouyang and S. M. Shahidehpour, “Short-term unit commitment expert system”, Electric Power Systems Research, 20 (1990), pp. 1-13], fuzzy logic [S. Saneifard, N. R. Prasad, and H. A. Smolleck, “A fuzzy logic approach to unit commitment”, IEEE Transactions on Power Systems, 12 (1997), pp. 988-995], meta-heuristic algorithms [A. H. Mantawy, Y. L. Abdel-Magid, and S. Z. Selim, “Integrating genetic algorithms, tabu search and simulated annealing for the unit commitment problem”, IEEE Transactions on Power Systems, 14 (1999), pp. 829-836], and ant colony systems [S. J. Huang, “Enhancement of hydroelectric generation scheduling using ant colony system based optimization approaches”, IEEE Transactions on Energy Conversion, 16 (2001), pp. 296-301] are among the other approaches that have been applied to the unit commitment problem. Surveys by Sheble and Fahd [G. B. Sheble and G. N. Fahd, “Unit commitment—Literature synopsis”, IEEE Transactions on Power Systems, 9 (1994), pp. 128-135] and by Padhy [N. P. Padhy, “Unit commitment—A bibliographical survey”, IEEE Transactions on Power Systems, 19 (2004), pp. 1196-1205] review the academic literature on the unit commitment problem, and the book by Wood and Wollenberg [A. J. Wood and B. F. Wollenberg, “Power Generation Operation and Control”, John Wiley and Sons, New York, 1996] addresses several operational and planning problems in the energy industry, including the unit commitment problem. We first give a deterministic version of the unit commitment problem and then introduce the stochastic unit commitment problem:


Indices and Sets:


















i ∈ I: generators
t ∈ {1, …, T}: time periods











Data:


















S_i(·): startup cost function of unit i
H_i(·): shutdown cost function of unit i
f_i(·): generation cost function of unit i
(Q_i, q_i): maximum and minimum amounts for unit i's offer
(R_i, r_i): ramp-up and ramp-down amounts for unit i
(L_i, l_i): minimum up and down times for unit i
d_t: load at time period t











Decision Variables:


















g_it: generation provided by unit i at time period t
s_it: binary variable indicating if unit i started at time period t
u_it: binary variable indicating if unit i is on at time period t
ū_it: vector of u_iτ variables, ū_it ≡ (u_iτ), τ = t − l_i + 1, …, t; for τ < 0, u_iτ is set to the on/off state of unit i at time period τ
u̲_it: vector of u_iτ variables, u̲_it ≡ (u_iτ), τ = t − L_i + 1, …, t; for τ < 0, u_iτ is set to the on/off state of unit i at time period τ











Formulation:











$$
\begin{aligned}
\min_{g,u,s}\quad & \sum_{t=1}^{T}\sum_{i\in I}\Bigl[S_{i}(\bar{u}_{it})+H_{i}(\underline{u}_{it})+f_{i}(g_{it})\Bigr] & (9a)\\
\text{s.t.}\quad & \sum_{i\in I} g_{it}\ge d_{t},\qquad t=1,\ldots,T, & (9b)\\
& q_{i}u_{it}\le g_{it}\le Q_{i}u_{it},\qquad i\in I,\ t=1,\ldots,T, & (9c)\\
& s_{it}\ge u_{it}-u_{i,t-1},\qquad i\in I,\ t=1,\ldots,T, & (9d)\\
& g_{it}-g_{i,t-1}\le R_{i},\qquad i\in I,\ t=1,\ldots,T, & (9e)\\
& g_{i,t-1}-g_{it}\le r_{i},\qquad i\in I,\ t=1,\ldots,T, & (9f)\\
& \sum_{\tau=\max\{1,\,t-L_{i}+1\}}^{t} s_{i\tau}\le u_{it},\qquad i\in I,\ t=1,\ldots,T, & (9g)\\
& \sum_{\tau=t+1}^{\min\{T,\,t+l_{i}\}} s_{i\tau}\le 1-u_{it},\qquad i\in I,\ t=1,\ldots,T, & (9h)\\
& u_{it}\in\{0,1\},\ s_{it}\in\{0,1\},\ g_{it}\ge 0,\qquad i\in I,\ t=1,\ldots,T. & (9i)
\end{aligned}
$$




Objective function (9a) is the total cost of generation summed over all time periods. The cost of generation includes startup cost, shutdown cost, and fuel cost. Startup (shutdown) cost can be a step function that takes its highest step value when the unit has been down (up) for a certain number of time periods, or it can be a piecewise function that increases with the number of time periods the unit has been down (up). Fuel cost may be a nonlinear function of the generation level, which may not be convex. Constraint (9b) requires total generation to be at least the load. Constraint (9c) has two functions: if the unit is down, it forces the generation to be zero; and, if the unit is up, it keeps the generation level between the lower and upper limits of the generator. Constraint (9d) links the start-up variables to the up/down variables. Constraints (9e) and (9f) handle the ramp-up and ramp-down limits. Constraint (9g) enforces the minimum up time requirement and constraint (9h) the minimum down time requirement. Finally, constraint (9i) imposes the binary and nonnegativity requirements.


Minimum up/down time constraints are sometimes formulated as

$$u_{it}-u_{i,t-1}\le u_{i\tau},\qquad \tau\in\{t+1,\ldots,\min\{t+L_{i},T\}\},\ i\in I,\ t=1,\ldots,T,\qquad(10a)$$
$$u_{i,t-1}-u_{it}\le 1-u_{i\tau},\qquad \tau\in\{t+1,\ldots,\min\{t+l_{i},T\}\},\ i\in I,\ t=1,\ldots,T.\qquad(10b)$$


If the set of constraints (10) is used to model the minimum up/down time constraints, there is no need to introduce the variables s_it in model (9). Rajan and Takriti claim that constraints (9d), (9g), and (9h) model the minimum up/down time polytope.

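As a concrete and deliberately simplified illustration of model (9), the sketch below builds the deterministic problem with linear stand-ins for the cost functions S_i(·) and f_i(·) (the shutdown cost is omitted) and with the minimum up/down times written in the style of constraints (10). The PuLP modeling layer and all field names are illustrative choices, not part of the disclosure:

```python
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

def deterministic_uc(gens, demand):
    """gens: dict i -> {"q","Q","R","r","L","l","fuel","start"} (illustrative data);
    demand: list of loads d_t for t = 0..T-1 (0-based periods)."""
    T = len(demand)
    prob = LpProblem("unit_commitment", LpMinimize)
    u = {(i, t): LpVariable(f"u_{i}_{t}", cat=LpBinary) for i in gens for t in range(T)}
    s = {(i, t): LpVariable(f"s_{i}_{t}", cat=LpBinary) for i in gens for t in range(T)}
    g = {(i, t): LpVariable(f"g_{i}_{t}", lowBound=0) for i in gens for t in range(T)}

    # (9a) with linear surrogates for start-up and fuel costs
    prob += lpSum(gens[i]["start"] * s[i, t] + gens[i]["fuel"] * g[i, t]
                  for i in gens for t in range(T))

    for t in range(T):
        prob += lpSum(g[i, t] for i in gens) >= demand[t]              # (9b)
    for i in gens:
        for t in range(T):
            prob += g[i, t] >= gens[i]["q"] * u[i, t]                  # (9c)
            prob += g[i, t] <= gens[i]["Q"] * u[i, t]                  # (9c)
            if t >= 1:
                prob += s[i, t] >= u[i, t] - u[i, t - 1]               # (9d)
                prob += g[i, t] - g[i, t - 1] <= gens[i]["R"]          # (9e)
                prob += g[i, t - 1] - g[i, t] <= gens[i]["r"]          # (9f)
                for tau in range(t + 1, min(t + gens[i]["L"], T)):     # cf. (10a)
                    prob += u[i, t] - u[i, t - 1] <= u[i, tau]
                for tau in range(t + 1, min(t + gens[i]["l"], T)):     # cf. (10b)
                    prob += u[i, t - 1] - u[i, t] <= 1 - u[i, tau]
    return prob, u, s, g
```

Calling prob.solve() with any PuLP-supported solver then returns a commitment schedule for the supplied illustrative data.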

We use a stochastic programming approach to address the uncertainty and intermittency of the wind power. We assume that the uncertainty evolves as a discrete time stochastic process with a finite probability space. We represent the information structure as a rooted scenario tree where the nodes n (nεN) in level t (tεT) of the scenario tree constitute the states of the world that can be distinguished by the information available up to time period t (Ahmed et al. 2003, Singh et al. 2009). The set of leaf nodes, NL (NL⊂N), contains the nodes without any successor. The root node is the node without any predecessor. In general, n(τ)εN represents the τth predecessor of node n. The level of the root node is zero, and in general the level of a node, tn, is defined such that n(tn) is the root node. The tree has a depth of T, and all leaf nodes have a level of T. By convention, n(0) is the node n itself, and n(τ), τ>tn, is an empty set.


The root node has an occurrence probability of one. For each node n ∈ N, π_n denotes the probability that the corresponding state of the world occurs given that its predecessor, n(1), has occurred; and p_n denotes the unconditional probability that the corresponding state occurs, i.e., p_n = ∏_{τ=0}^{t_n−1} π_{n(τ)}. There is a one-to-one matching between the leaf nodes of the scenario tree and the scenarios. Given a leaf node, n ∈ N_L, a T-tuple [n(T), n(T−1), …, n(1), n] represents a scenario with probability of occurrence equal to p_n. Two scenarios sharing the same state of the world at time periods 1, …, τ, for some τ < T, have to observe the same set of decision variables in the optimization model, in order to make sure that the model does not cheat by foreseeing (anticipating) the future. Using only a single set of decision variables for each node guarantees such a non-anticipativity property, and yet keeps the model size small as compared to using a separate set of decision variables for each scenario and for each time period and setting the variables equal to each other (Takriti et al. 2000, Lulli and Sen 2004).

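The node bookkeeping just described can be captured with a small data structure; the following sketch (illustrative only) stores, for each node, its predecessor and conditional probability π_n, and derives the unconditional probability p_n as the product over the path to the root:

```python
class ScenarioTree:
    """Nodes are integers; node 0 is the root with probability one."""

    def __init__(self):
        self.parent = {0: None}    # n -> n(1)
        self.cond_prob = {0: 1.0}  # n -> pi_n

    def add_node(self, node, parent, pi):
        self.parent[node] = parent
        self.cond_prob[node] = pi

    def predecessor(self, node, tau):
        """n(tau): the tau-th predecessor (None once the root is passed)."""
        for _ in range(tau):
            if node is None:
                return None
            node = self.parent[node]
        return node

    def prob(self, node):
        """p_n: product of pi over the path from the node up to the root."""
        p = 1.0
        while node is not None:
            p *= self.cond_prob[node]
            node = self.parent[node]
        return p

    def level(self, node):
        """t_n: number of predecessors between the node and the root."""
        t = 0
        while self.parent[node] is not None:
            node = self.parent[node]
            t += 1
        return t
```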

The stochastic unit commitment model is an extension of model (9), where nodes n in the scenario tree replace the time periods t in the deterministic model (9).


Formulation:











$$
\begin{aligned}
\min_{g,u,s}\quad & \sum_{n\in N} p_{n}\sum_{i\in I}\Bigl[S_{i}(\bar{u}_{in})+H_{i}(\underline{u}_{in})+f_{i}(g_{in})\Bigr] & (11a)\\
\text{s.t.}\quad & \sum_{i\in I} g_{in}\ge d_{n},\qquad n\in N, & (11b)\\
& q_{i}u_{in}\le g_{in}\le Q_{i}u_{in},\qquad i\in I,\ n\in N, & (11c)\\
& s_{in}\ge u_{in}-u_{i,n(1)},\qquad i\in I,\ n\in N, & (11d)\\
& g_{in}-g_{i,n(1)}\le R_{i},\qquad i\in I,\ n\in N, & (11e)\\
& g_{i,n(1)}-g_{in}\le r_{i},\qquad i\in I,\ n\in N, & (11f)\\
& \sum_{\tau=0}^{L_{i}-1} s_{i,n(\tau)}\le u_{in},\qquad i\in I,\ n\in N, & (11g)\\
& \sum_{\tau=1}^{l_{i}-1} s_{i,n(\tau)}\le 1-u_{i,n(l_{i})},\qquad i\in I,\ n:\ n(l_{i})\in N, & (11h)\\
& \sum_{\tau=1}^{l_{i}-1-p} s_{i,n(\tau)}\le 1-u_{i,n(l_{i}-p)},\qquad i\in I,\ n\in N_{L},\ p\in\{1,\ldots,l_{i}-1\}, & (11i)\\
& u_{in}\in\{0,1\},\ s_{in}\in\{0,1\},\ g_{in}\ge 0,\qquad i\in I,\ n\in N. & (11j)
\end{aligned}
$$




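Extending the earlier deterministic sketch to model (11) mainly replaces the time index t by a node index n and links each node to its predecessor n(1). The condensed illustration below again uses linear cost surrogates and illustrative names, assumes a ScenarioTree-like object such as the one sketched above (exposing .parent and .prob()), and omits the minimum up/down time constraints (11g)-(11i) for brevity:

```python
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

def stochastic_uc(gens, tree, load):
    """tree: object with .parent (n -> n(1)) and .prob(n) (p_n); load: dict n -> d_n."""
    nodes = list(tree.parent)
    prob = LpProblem("stochastic_unit_commitment", LpMinimize)
    u = {(i, n): LpVariable(f"u_{i}_{n}", cat=LpBinary) for i in gens for n in nodes}
    s = {(i, n): LpVariable(f"s_{i}_{n}", cat=LpBinary) for i in gens for n in nodes}
    g = {(i, n): LpVariable(f"g_{i}_{n}", lowBound=0) for i in gens for n in nodes}

    # (11a): node costs weighted by the unconditional probabilities p_n
    prob += lpSum(tree.prob(n) * (gens[i]["start"] * s[i, n] + gens[i]["fuel"] * g[i, n])
                  for i in gens for n in nodes)

    for n in nodes:
        prob += lpSum(g[i, n] for i in gens) >= load[n]                  # (11b)
        parent = tree.parent[n]
        for i in gens:
            prob += g[i, n] >= gens[i]["q"] * u[i, n]                    # (11c)
            prob += g[i, n] <= gens[i]["Q"] * u[i, n]                    # (11c)
            if parent is not None:
                prob += s[i, n] >= u[i, n] - u[i, parent]                # (11d)
                prob += g[i, n] - g[i, parent] <= gens[i]["R"]           # (11e)
                prob += g[i, parent] - g[i, n] <= gens[i]["r"]           # (11f)
    return prob, u, s, g
```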
We consider a simple wind model example with four time periods. Wind speed at each time period follows a Gaussian random variable with a mean of 10 miles per hour and a standard deviation of 2.5 miles per hour. We compare three approaches to generating a scenario tree. The first approach samples five points from the Gaussian random variable for each time period and constructs a scenario tree with 625 scenarios (see “5-sample” in FIG. 1, graphed as square points). We solve the stochastic unit commitment problem using this tree, repeat this process fifty times, and take the cumulative average of the optimal value of the problem. The second approach does the same with 3 samples per time period, for a total of 81 scenarios (see “3-sample” in FIG. 1, graphed as diamond points). The third approach uses the above scenario reduction technique: it first generates 100 samples from the Gaussian random variable, reduces them to 3 points, and then generates a scenario tree for the four time periods using these 3 points, for a total of 81 scenarios (see “3-s-r” in FIG. 1, graphed as x points). FIG. 1 shows the better convergence of the last approach, as it is in effect reducing a scenario tree with 1,000,000 scenarios to 81 scenarios, whereas the first approach uses only 625 scenarios and the second approach uses 81 scenarios. We also observe the better convergence of the first approach over the second, as it uses more scenarios.

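A compact Python sketch of the third approach in this experiment follows. The sample sizes mirror the description above; the equal-weight k-median routine, the random seed, and all names are illustrative assumptions, and the split-point enumeration is a simpler equal-weight variant of the exact reduction sketched earlier:

```python
import math
import random
from itertools import combinations, product

def kmedian_sorted(samples, k):
    """Exact k-median of equally weighted 1-D samples: optimal clusters are
    contiguous in sorted order, so enumerate the k - 1 split points."""
    xs = sorted(samples)
    best_cost, best_centers = float("inf"), None
    for splits in combinations(range(1, len(xs)), k - 1):
        bounds = (0, *splits, len(xs))
        centers, cost = [], 0.0
        for a, b in zip(bounds, bounds[1:]):
            block = xs[a:b]
            med = block[len(block) // 2]          # median of the block
            centers.append(med)
            cost += sum(abs(x - med) for x in block)
        if cost < best_cost:
            best_cost, best_centers = cost, centers
    return best_centers

random.seed(0)
T, n_samples, k = 4, 100, 3
reduced = []
for _ in range(T):
    draws = [random.gauss(10.0, 2.5) for _ in range(n_samples)]   # wind speed, mph
    centers = kmedian_sorted(draws, k)
    mass = {c: 0.0 for c in centers}
    for x in draws:                                               # nearest-median mass
        mass[min(centers, key=lambda c: abs(x - c))] += 1.0 / n_samples
    reduced.append(list(mass.items()))

tree = {}
for combo in product(*reduced):                                   # 3 points per period
    tree[tuple(v for v, _ in combo)] = math.prod(p for _, p in combo)
print(len(tree), round(sum(tree.values()), 6))                    # 81 scenarios, mass 1.0
```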

Referring now to FIG. 2, this FIG. shows a hardware configuration of computing system 200 according to an embodiment of the present invention. As seen, this hardware configuration has at least one processor or central processing unit (CPU) 211. The CPUs 211 are interconnected via a system bus 212 to a random access memory (RAM) 214, read-only memory (ROM) 216, input/output (I/O) adapter 218 (for connecting peripheral devices such as disk units 221 and tape drives 240 to the bus 212), user interface adapter 222 (for connecting a keyboard 224, mouse 226, speaker 228, microphone 232, and/or other user interface device to the bus 212), a communications adapter 234 for connecting the system 200 to a data processing network, the Internet, an Intranet, a local area network (LAN), etc., and a display adapter 236 for connecting the bus 212 to a display device 238 and/or printer 239 (e.g., a digital printer or the like).


Referring now to FIG. 3, a flowchart according to an embodiment of the present invention is shown. As seen, in this example, step 301 is receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios. Further, step 303 is reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios. Further, step 305 is receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios. Further, step 307 is reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios. Further, step 309 is generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios.


Referring now to FIG. 4, a flowchart according to an embodiment of the present invention is shown. As seen, in this example, step 401 is receiving for each of a plurality of time periods a respective forecast, wherein the plurality of time periods comprise at least a first time period, at least a second time period and at least one intermediate time period, and wherein each forecast comprises a respective set of scenarios. Further, step 403 is for each time period in sequence, reducing the respective set of scenarios to a respective subset of scenarios, wherein each subset of scenarios comprises a subset of the respective set of scenarios. Further, step 405 is generating a scenario tree based at least in part upon the subset of scenarios.


In another example, various embodiments may relate to manufacturing operations (e.g., industrial). In another example, various embodiments may relate to energy distribution and intelligent utility networks (IUN). In another example, various embodiments may relate to the “Smarter Planet” area. In another example, various embodiments may relate to operations and manufacturing applications (e.g., software).


In another example, various embodiments may provide an E&U specific solution to ILOG. In another example, various embodiments may provide a differentiating solution for E&U for the “SmartGrid” area.


In other examples, any steps described herein may be carried out in any appropriate desired order.


As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.


Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The containment (or storage) of the program may be non-transitory.


A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.


Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.


Computer program code for carrying out operations for aspects of the present invention may be written in any programming language or any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like or a procedural programming language, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).


Aspects of the present invention may be described herein with reference to flowchart illustrations and/or block diagrams of methods, systems and/or computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.


The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus or other devices provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.


It is noted that the foregoing has outlined some of the objects and embodiments of the present invention. This invention may be used for many applications. Thus, although the description is made for particular arrangements and methods, the intent and concept of the invention are suitable and applicable to other arrangements and applications. It will be clear to those skilled in the art that modifications to the disclosed embodiments can be effected without departing from the spirit and scope of the invention. The described embodiments ought to be construed as merely illustrative of some of the features and applications of the invention. Other beneficial results can be realized by applying the disclosed invention in a different manner or modifying the invention in ways known to those familiar with the art. In addition, all of the examples disclosed herein are intended to be illustrative, and not restrictive.

Claims
  • 1. A system for providing scenario tree generation based at least in part upon scenario reduction, the system comprising one or more processor units configured for: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency; wherein the steps of reducing comprise applying an input $\Xi$ to an algorithm to produce an output $\tilde{\Xi}$; and wherein the algorithm comprises:
  • 2. The system of claim 1, further comprising applying the scenario tree to a stochastic unit commitment problem.
  • 3. The system of claim 2, wherein the stochastic unit commitment problem relates to wind power generation.
  • 4. The system of claim 3, wherein each of the first forecast and the second forecast comprises a forecast of wind speed.
  • 5. The system of claim 1, wherein the steps are carried out in the order recited.
  • 6. The system of claim 2, further comprising generating energy at a set of generators based upon the application of the scenario tree to the stochastic unit commitment problem.
  • 7. The system of claim 6, further comprising generating the energy according to an optimal up and down schedule and corresponding generation amounts for the set of generators over a twenty-four-hour horizon.
  • 8. The system of claim 7, further comprising generating the energy according to the optimal up and down schedule and the corresponding generation amounts for the set of generators over the twenty-four-hour horizon so that a total cost of generation and transmission is minimized.
  • 9. The system of claim 8, further comprising generating the energy according to the optimal up and down schedule and the corresponding generation amounts for the set of generators over the twenty-four-hour horizon so that the total cost of generation and transmission is minimized and a set of constraints is observed.
  • 10. The system of claim 1, wherein: $\Xi=\{\xi^i\}_{i=1}^{S}$, $\xi^i\in\mathbb{R}^T$, with marginal distributions $\Xi_t=\{\xi_t^{i_t}\}_{i_t\in I_t}$, $t=1,\ldots,T$, $n_t\in\mathbb{N}$, $t=1,\ldots,T$.
  • 11. The system of claim 1, wherein: $\tilde{\Xi}=\{\tilde{\xi}^j\}_{j=1}^{\tilde{S}}$, $\tilde{\xi}^j\in\mathbb{R}^T$, with marginal distributions $\tilde{\Xi}_t=\{\tilde{\xi}_t^{j_t}\}_{j_t\in J_t}$, $|J_t|=n_t$, $t=1,\ldots,T$.
  • 12. A system for providing scenario tree generation based at least in part upon scenario reduction, the system comprising one or more processor units configured for: receiving for each of a plurality of time periods a respective forecast, wherein the plurality of time periods comprise at least a first time period, at least a second time period and at least one intermediate time period, and wherein each forecast comprises a respective set of scenarios; for each time period in sequence, reducing the respective set of scenarios to a respective subset of scenarios, wherein each subset of scenarios comprises a subset of the respective set of scenarios; and generating a scenario tree based at least in part upon the subset of scenarios; wherein the scenario tree has interperiod independency; wherein the steps of reducing comprise applying an input $\Xi$ to an algorithm to produce an output $\tilde{\Xi}$; wherein the algorithm comprises:
  • 13. The system of claim 12, further comprising applying the scenario tree to a stochastic unit commitment problem.
  • 14. The system of claim 13, wherein the stochastic unit commitment problem relates to wind power generation.
  • 15. The system of claim 14, wherein each of the forecasts comprises a forecast of wind speed.
  • 16. The system of claim 12, wherein the steps are carried out in the order recited.
  • 17. The system of claim 12, wherein: $\Xi=\{\xi^i\}_{i=1}^{S}$, $\xi^i\in\mathbb{R}^T$, with marginal distributions $\Xi_t=\{\xi_t^{i_t}\}_{i_t\in I_t}$, $t=1,\ldots,T$, $n_t\in\mathbb{N}$, $t=1,\ldots,T$.
  • 18. The system of claim 12, wherein: $\tilde{\Xi}=\{\tilde{\xi}^j\}_{j=1}^{\tilde{S}}$, $\tilde{\xi}^j\in\mathbb{R}^T$, with marginal distributions $\tilde{\Xi}_t=\{\tilde{\xi}_t^{j_t}\}_{j_t\in J_t}$, $|J_t|=n_t$, $t=1,\ldots,T$.
  • 19. A method for providing scenario tree generation based at least in part upon scenario reduction, the method comprising: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency; wherein the steps of reducing comprise applying an input $\Xi$ to an algorithm to produce an output $\tilde{\Xi}$; and
  • 20. The method of claim 19, wherein the steps are carried out in the order recited.
  • 21. An article of manufacture, comprising: at least one non-transitory computer usable device having a computer readable program code logic tangibly embodied therein to execute at least one machine instruction in a processing unit for providing scenario tree generation based at least in part upon scenario reduction, said computer readable program code logic, when executing, performing the following steps: receiving for at least a first time period a first forecast, wherein the first forecast comprises a first set of scenarios; reducing the first set of scenarios to a first subset of scenarios, wherein the first subset of scenarios comprises a subset of the first set of scenarios; receiving for at least a second time period a second forecast, wherein the second forecast comprises a second set of scenarios; reducing the second set of scenarios to a second subset of scenarios, wherein the second subset of scenarios comprises a subset of the second set of scenarios; and generating a scenario tree based at least in part upon the first subset of scenarios and the second subset of scenarios; wherein the scenario tree has interperiod independency; wherein the steps of reducing comprise applying an input $\Xi$ to an algorithm to produce an output $\tilde{\Xi}$; and wherein: $\Xi=\{\xi^i\}_{i=1}^{S}$, $\xi^i\in\mathbb{R}^T$, with marginal distributions $\Xi_t=\{\xi_t^{i_t}\}_{i_t\in I_t}$, $t=1,\ldots,T$, $n_t\in\mathbb{N}$, $t=1,\ldots,T$, and $\tilde{\Xi}=\{\tilde{\xi}^j\}_{j=1}^{\tilde{S}}$, $\tilde{\xi}^j\in\mathbb{R}^T$, with marginal distributions $\tilde{\Xi}_t=\{\tilde{\xi}_t^{j_t}\}_{j_t\in J_t}$, $|J_t|=n_t$, $t=1,\ldots,T$; and wherein the algorithm comprises:
  • 22. The article of manufacture of claim 21, wherein the steps are carried out in the order recited.
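For orientation only, the following sketch illustrates the general shape of per-period scenario reduction followed by scenario-tree assembly under interperiod independence, as recited in claims 1 and 12. It is not the claimed algorithm (whose listing is omitted from this text) and does not limit the claims; the reduction step shown is a generic greedy forward-selection heuristic in the spirit of the scenario-reduction literature cited below (e.g., Heitsch et al.), and all function names, variable names, and sample values are hypothetical.

# Illustrative sketch only: per-period scenario reduction, then tree assembly
# under interperiod independence. NOT the claimed algorithm; a generic
# forward-selection heuristic is used here as a stand-in reduction step.
from itertools import product

def reduce_scenarios(samples, probs, n_keep):
    """Greedy forward selection: keep n_keep values from `samples` and move the
    probability of each discarded value to its nearest kept value."""
    assert n_keep <= len(samples)
    kept = []
    remaining = list(range(len(samples)))
    for _ in range(n_keep):
        # Choose the candidate minimizing the probability-weighted distance of
        # every remaining sample to its nearest already-kept (or candidate) value.
        best, best_cost = None, float("inf")
        for c in remaining:
            cost = sum(
                probs[i] * min(abs(samples[i] - samples[k]) for k in kept + [c])
                for i in remaining
            )
            if cost < best_cost:
                best, best_cost = c, cost
        kept.append(best)
        remaining.remove(best)
    # Reassign the probability mass of discarded samples to the closest kept one.
    kept_probs = {k: probs[k] for k in kept}
    for i in remaining:
        nearest = min(kept, key=lambda k: abs(samples[i] - samples[k]))
        kept_probs[nearest] += probs[i]
    return [samples[k] for k in kept], [kept_probs[k] for k in kept]

def build_tree(per_period_samples, n_keep_per_period):
    """Reduce each period's sample set, then enumerate every path through the
    reduced per-period values. Interperiod independence makes a path's
    probability the product of its per-period probabilities."""
    reduced = []
    for t, samples in enumerate(per_period_samples):
        probs = [1.0 / len(samples)] * len(samples)  # equiprobable Monte-Carlo samples
        vals, ps = reduce_scenarios(samples, probs, n_keep_per_period[t])
        reduced.append(list(zip(vals, ps)))
    tree = []
    for path in product(*reduced):
        values = tuple(v for v, _ in path)
        prob = 1.0
        for _, p in path:
            prob *= p
        tree.append((values, prob))
    return tree

if __name__ == "__main__":
    # Two periods of hypothetical wind-speed samples (m/s), reduced to 2 and 3 nodes.
    samples_t1 = [6.1, 6.4, 7.0, 8.2, 8.5, 9.1]
    samples_t2 = [5.0, 5.5, 6.2, 7.8, 8.0, 9.5]
    for scenario, prob in build_tree([samples_t1, samples_t2], [2, 3]):
        print(scenario, round(prob, 3))

Under these assumptions, stage t of the resulting tree carries n_t distinct values (corresponding to |J_t| = n_t above), and every stage-(t-1) node branches to all n_t stage-t values, which is one way the interperiod-independency property of the generated scenario tree can be realized; the path probabilities sum to one by construction.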
US Referenced Citations (3)
Number Name Date Kind
6021402 Takriti Feb 2000 A
20060089864 Feng et al. Apr 2006 A1
20120122505 Dotzler et al. May 2012 A1
Non-Patent Literature Citations (30)
Entry
Kaut “Scenario tree generation for stochastic programming: Cases from finance”, doctoral thesis, 2003, pp. 140.
Ruiz et al. “Applying Stochastic Programming to the Unit Commitment Problem”, PMAPS 2008, pp. 6.
Sahinidis, N.V., “Optimization under uncertainty: state-of-the-art and opportunities”, Computers and Chemical Engineering, Jun. 2004, vol. 28, Issue 6-7, pp. 971-983.
Dupacova, J., et al., “Scenario reduction in stochastic programming: An approach using probability metrics”, Mathematical Programming, 2003, vol. 511, Issue 3, pp. 493-511.
Dillon, T., et al., “Integer programming approach to the problem of optimal unit commitment with probabilistic reserve determination”, IEEE Transactions Power Systems, Nov. 1978, pp. 2154-2166.
Saneifard, S., et al., “A fuzzy logic approach to unit commitment”, IEEE Transactions on Power Systems, May 1997, vol. 12, Issue 2, pp. 0885-8950.
Mantawy, A. H., et al., “Integrating genetic algorithms, tabu search and simulated annealing for the unit commitment problem,” IEEE Transactions on Power Systems, Aug. 1999, vol. 14, Issue 3, pp. 829-836.
Huang, S.J., “Enhancement of hydroelectric generation scheduling using ant colony system based optimization approaches”, IEEE Transactions on Energy Conversion, Sep. 2001, vol. 16, No. 3, pp. 296-301.
Sheble, G.B., et al., “Unit commitment—Literature synopsis”, IEEE Transactions on Power Systems, Feb. 1994, vol. 9, Issue 1, pp. 128-135.
Padhy, N. P., “Unit commitment—A bibliographical survey”, IEEE Transactions on Power Systems, May 2004, vol. 19, Issue 2, pp. 1196-1205.
Wood, A. J., et al., “Power Generation, Operation, and Control”, Wiley-Interscience, Jan. 1996.
Lulli, G., et al., “A branch-and-price algorithm for multistage stochastic integer programming with application to stochastic batch-sizing problems”, Management Science, Jun. 2004, vol. 50, No. 6, pp. 786-796.
Takriti, S., et al., “Incorporating fuel constraints and electricity spot prices into the stochastic unit commitment problem”, Operations Research, Mar.-Apr. 2000, vol. 48, No. 2, pp. 268-280.
Singh, K. J., et al., “Dantzig-Wolfe decomposition for solving multistage stochastic capacity-planning problems”, Operations Research, Sep.-Oct. 2009, vol. 57, No. 5, pp. 1271-1286.
Ljubic, I., “A Branch-Price-and-Cut Algorithm for Vertex-Biconnectivity Augmentation”, Networks, Oct. 2010, vol. 56, Issue 3, pp. 169-182.
Heitsch, H., et al., “Scenario Reduction Algorithms in Stochastic Programming”, Computational Optimization Applications, Feb.-Mar. 2003, vol. 24, Issue 2-3, pp. 1-21.
Growe-Kuska, N., et al., “Stochastic Unit Commitment in Hydrothermal Power Production Planning”, Applications of Stochastic Programming, 2005, pp. 633-653.
Snyder, W.L., et al., “Dynamic Programming Approach to Unit Commitment”, IEEE Transactions on Power Systems, May 1987, vol. PWRS-2, Issue 2, pp. 339-348.
Ouyang, Z., et al.,“Short-Term Unit Commitment Expert System”, Electric Power Systems Research, Dec. 1990, vol. 20, Issue 1, pp. 1-13.
Ahmed, A., “A Multistage Stochastic Integer Programming Approach for Capacity Expansion Under Uncertainty”, May 2003, vol. 26, Issue 1, pp. 1-21.
Salam, S., “Unit Commitment Solution Methods”, Proceedings of World Academy of Science, Engineering and Technology, Dec. 2007, vol. 26, pp. 320-325.
Uyar, S., “Evolutionary Algorithms for the Unit Commitment Problem”, Turkish Journal of Electrical Engineering & Computer Sciences, Nov. 2008, vol. 16, No. 3, pp. 239-255.
Akella, M., et al., “Branch and Price: Column Generation for Solving Huge Integer Programs”, University at Buffalo, created Oct. 29, 2011, Department of Industrial Engineering, pp. 1-20.
Desrosiers, J., et al., “Branch-Price-and-Cut Algorithms”, Wiley Encyclopedia of Operations Research and Management Science, Apr. 2010, pp. 1-18.
Ladanyi, L., et al., “Software Tools for Implementing Branch, Cut, and Price Algorithms”, INFORMS Annual Meeting, Nov. 20, 2002, pp. 1-42.
Savelsbergh, M., “A Branch-and-Price Algorithm for the Generalized Assignment Problem”, Operations Research, Nov.-Dec. 1997, vol. 45, No. 6, pp. 1-23.
Ljubic, I., “A Branch-Price-and-Cut Algorithm for Vertex-Biconnectivity Augmentation”, Networks, Oct. 2010, vol. 56, Issue 3, pp. 1-29.
Villa et al., “A Column-Generation and Branch-and-Cut Approach to the Bandwidth-Packing Problem”, J. of Research of NIST, vol. 111, No. 2, 2006, pp. 161-185.
Parvania et al., “Reliability-Constrained Unit Commitment using Stochastic Mixed-Integer Programming”, IEEE 2010, pp. 200-205.
United States Office Action dated Sep. 10, 2014, received in related U.S. Appl. No. 13/414,044.
Related Publications (1)
Number Date Country
20130238530 A1 Sep 2013 US