In recent years we extended Shannon static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density, and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe respectively the evolution laws of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropies and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion and production in state variable space inside the systems and in coordinate space in the transmission processes, and that the time rate of change of the dynamic information densities originates from their drift, diffusion and dissipation in state variable space inside the systems and in coordinate space in the transmission processes. Entropy and information are thus combined with the state of the systems and its law of motion. Furthermore, we presented the formulas of the two kinds of entropy production rates and information dissipation rates, and the expressions of the two kinds of drift information flows and diffusion information flows.
We proved that the two kinds of information dissipation rates (the decrease rates of the total information) are equal to their corresponding entropy production rates (the increase rates of the total entropy) in the same dynamic system. We obtained the formulas of the two kinds of dynamic mutual information and dynamic channel capacities reflecting the dynamic dissipation characteristics of the transmission processes; these reduce to their maxima, the present static mutual information and static channel capacity, in the limit where the ratio of channel length to information transmission rate approaches zero. All these unified and rigorous theoretical formulas and results are derived from the evolution equations of dynamic information and dynamic entropy without adding any extra assumption. In this review we give an overview of the above main ideas, methods and results, and discuss the similarities and differences between the two kinds of dynamic statistical information theories.
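The drift-diffusion-production structure described in the abstract can be sketched schematically. The notation below (drift coefficient $A$, diffusion coefficient $B$, production term $\sigma_s$) is assumed for illustration and is not taken from the paper itself; it is a minimal one-dimensional sketch of the kind of balance law the abstract describes:

```latex
% Schematic sketch (assumed notation): a state probability density P(x,t)
% obeying a one-dimensional Fokker-Planck equation, with Shannon entropy
% density s(x,t) = -P ln P.
\[
  \frac{\partial P}{\partial t}
    = -\frac{\partial}{\partial x}\bigl[A(x)\,P\bigr]
      + \frac{\partial^{2}}{\partial x^{2}}\bigl[B(x)\,P\bigr]
\]
% The entropy density then evolves as a balance of drift, diffusion,
% and production contributions:
\[
  \frac{\partial s}{\partial t}
    = \underbrace{-\frac{\partial}{\partial x}\bigl[A\,s\bigr]}_{\text{drift}}
      + \underbrace{\frac{\partial^{2}}{\partial x^{2}}\bigl[B\,s\bigr]}_{\text{diffusion}}
      + \underbrace{\sigma_{s}}_{\text{production}},
  \qquad \sigma_{s} \ge 0 .
\]
```

A dual balance law with a dissipation term in place of the production term would then govern the dynamic information density, consistent with the abstract's statement that the information dissipation rate equals the entropy production rate.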
A virtual synchronous generator (VSG) can provide inertial support through renewables and energy storage. It generally operates in parallel with a diesel generator (DSG) in an islanded microgrid. However, unforeseen interactive power oscillations occur in the paralleled system when loads fluctuate; these may also burn out the VSG owing to its low overcurrent capacity. The mechanism and a suppression strategy for the power oscillation of a VSG-DSG paralleled system are investigated. The analysis reveals that the interactive power oscillation is caused essentially by the physical difference and parameter mismatch between the VSG and the DSG. The condition for eliminating the oscillation is then derived. Subsequently, a comprehensive suppression control strategy based on virtual inductance and dynamic mutual damping technology is proposed. Finally, experimental results verify the effectiveness of the proposed method.
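The interaction described above can be illustrated with a toy model. The sketch below is not the paper's model: it uses two classical swing equations, coupled through a synchronizing power term, with all parameter values (`M1`, `D1`, `M2`, `D2`, `K`, the load step `dP`) chosen arbitrarily to create an inertia/damping mismatch. A "mutual damping" term acting on the speed difference stands in for the paper's dynamic mutual damping idea and visibly shrinks the interactive power swing after a load step:

```python
# Illustrative sketch (assumed parameters, not the paper's model): a VSG and
# a DSG modeled as two swing equations coupled through the synchronizing
# power p12 = K*(delta1 - delta2). The inertia/damping mismatch (M1,D1 vs
# M2,D2) makes p12 oscillate after a shared load step; a mutual damping
# term Dm*(w1 - w2) suppresses the relative motion.

def simulate(Dm, T=20.0, dt=1e-3):
    M1, D1 = 2.0, 1.0   # VSG virtual inertia and damping (assumed values)
    M2, D2 = 8.0, 2.0   # DSG inertia and damping (deliberately mismatched)
    K = 5.0             # synchronizing coefficient (assumed)
    dP = 1.0            # load step, shared equally between the two units
    d1 = d2 = w1 = w2 = 0.0
    peak = 0.0
    for _ in range(int(T / dt)):
        p12 = K * (d1 - d2)                       # interactive power
        dw1 = (dP / 2 - D1 * w1 - p12 - Dm * (w1 - w2)) / M1
        dw2 = (dP / 2 - D2 * w2 + p12 - Dm * (w2 - w1)) / M2
        w1 += dw1 * dt
        w2 += dw2 * dt
        d1 += w1 * dt
        d2 += w2 * dt
        peak = max(peak, abs(p12))
    return peak

# Without mutual damping the interactive power overshoots its steady value;
# with Dm > 0 the relative mode is damped and the peak swing is smaller.
print(simulate(Dm=0.0), simulate(Dm=3.0))
```

The point of the sketch is only qualitative: the oscillation lives in the relative coordinate between the two machines, so any control acting on their speed difference damps it without disturbing the common frequency dynamics.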
Funding: Supported by the Science and Technology Project of China Southern Power Grid (ZBKJXM20180211).