MULTIVARIABLE OPTIMIZATION WITH CONSTRAINTS




ABSTRACT

In nonlinear programming, there are five methods of solving multivariable optimization problems with constraints.

In this project, the usefulness of two of these methods, the Kuhn-Tucker conditions and the Lagrange multiplier method, in quadratic programming is unveiled.

We also show how the other methods are used in solving constrained optimization problems, and all of this is supported with examples to aid understanding.

TABLE OF CONTENTS

Title Page
Approval Page
Dedication
Acknowledgement
Abstract
Table of Contents

CHAPTER ONE
1.0  Introduction
1.1  Basic Definitions
1.2  Layout of Work

CHAPTER TWO
2.0  Introduction
2.1  Lagrange Multiplier Method
2.2  Kuhn-Tucker Conditions
2.3  Sufficiency of the Kuhn-Tucker Conditions
2.4  Kuhn-Tucker Theorems
2.5  Definitions – Maximum and Minimum of a Function
2.6  Summary

CHAPTER THREE
3.0  Introduction
3.1  Newton-Raphson Method
3.2  Penalty Function
3.3  Method of Feasible Directions
3.4  Summary

CHAPTER FOUR
4.0  Introduction
4.1  Definition – Quadratic Programming
4.2  General Quadratic Problems
4.3  Methods
4.4  Ways/Procedures of Obtaining the Optimal Solution from the Kuhn-Tucker Conditions Method
     • The Two-Phase Method
     • The Elimination Method
4.5  Summary

CHAPTER FIVE
Conclusion
References

CHAPTER ONE

1.0  INTRODUCTION

There are two types of optimization problems:

Type 1

Minimize or maximize   Z = f(x),   x ∈ R^n                                    (1)

Type 2

Minimize or maximize   Z = f(x)                                               (2)

subject to   gi(x) ~ bi,   i = 1, 2, ..., m,                                  (3)

where x ∈ R^n and, for each i, ~ can be ≤, ≥ or =.

Type 1 is called the unconstrained optimization problem. It has an objective function without constraints. The methods used in solving such problems are the direct search methods and the gradient method (the method of steepest ascent).
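As a purely illustrative aside (not one of the project's examples), the following minimal Python sketch shows the gradient idea on a hypothetical unconstrained minimization; the objective, starting point and fixed step size are assumptions made for the sketch, and a descent (rather than ascent) direction is used because the example is a minimization.

    # Minimal sketch of the gradient (steepest descent) method for an
    # unconstrained problem; the objective, start point and step size are
    # hypothetical choices, not taken from this project.
    def f(x1, x2):                     # objective: f(x) = (x1 - 3)^2 + (x2 + 1)^2
        return (x1 - 3) ** 2 + (x2 + 1) ** 2

    def grad_f(x1, x2):                # gradient of f
        return 2 * (x1 - 3), 2 * (x2 + 1)

    x1, x2 = 0.0, 0.0                  # starting point
    step = 0.1                         # fixed step size (a real method would line-search)
    for _ in range(200):               # iterate x <- x - step * grad f(x)
        g1, g2 = grad_f(x1, x2)
        x1, x2 = x1 - step * g1, x2 - step * g2

    print(round(x1, 3), round(x2, 3))  # approaches the minimizer (3, -1)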

In this project, we shall be concerned with optimization problems with constraints.

Type 2 is called the constrained optimization problem. It has an objective function and constraints. The constraints can either be equality constraints (=) or inequality constraints (≤, ≥).

Moreover, in optimization problems with inequality constraints, the non-negativity conditions x ≥ 0 are part of the constraints.

Also, at least one of the functions f(x) and gi(x) is nonlinear, and all the functions are continuously differentiable.

There are five methods of solving the constrained multivariable optimization problem. These are:

1. The Lagrange multiplier method
2. The Kuhn-Tucker conditions
3. The gradient methods:
   (a) the Newton-Raphson method
   (b) the penalty function
4. The method of feasible directions.

The Lagrange multiplier method is used in solving optimization problems with equality constraints, while the Kuhn-Tucker conditions are used in solving optimization problems with inequality constraints; both play a major role in a type of constrained multivariable optimization called “quadratic programming”.
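As a quick illustration of the Lagrange multiplier idea, consider the hypothetical problem of minimizing f(x1, x2) = x1^2 + x2^2 subject to x1 + x2 = 1 (an assumption made for illustration, not one of the project's problems). A minimal Python sketch solving the stationarity conditions of the Lagrangian symbolically:

    # Hypothetical example: minimize x1^2 + x2^2 subject to x1 + x2 = 1,
    # solved by forming the Lagrangian and setting its partial derivatives to zero.
    import sympy as sp

    x1, x2, lam = sp.symbols('x1 x2 lam')
    f = x1**2 + x2**2                   # objective function
    g = x1 + x2 - 1                     # equality constraint written as g(x) = 0
    L = f + lam * g                     # the Lagrangian L(x, lambda)

    stationary = sp.solve([sp.diff(L, v) for v in (x1, x2, lam)], (x1, x2, lam))
    print(stationary)                   # {x1: 1/2, x2: 1/2, lam: -1}

Setting the three partial derivatives to zero gives 2*x1 + lam = 0, 2*x2 + lam = 0 and x1 + x2 = 1, so the minimum is attained at x1 = x2 = 1/2.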

The gradient methods include the Newton-Raphson method and the penalty function. They are used in solving optimization problems with equality constraints, while the method of feasible directions is used in solving problems with inequality constraints.
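To give a flavour of the penalty-function idea, here is a minimal sketch on a hypothetical problem (the objective, constraint and penalty schedule are assumptions chosen for illustration; the project itself develops the method analytically): the equality constraint is folded into the objective as a squared penalty term, and the resulting unconstrained problem is re-minimized with ever larger penalty parameters.

    # Illustrative penalty-function sketch for the hypothetical problem
    #   minimize (x1 - 2)^2 + (x2 - 2)^2  subject to  x1 + x2 = 2.
    import numpy as np
    from scipy.optimize import minimize

    def g(x):                              # equality constraint written as g(x) = 0
        return x[0] + x[1] - 2.0

    def penalized(x, mu):                  # unconstrained surrogate objective
        return (x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2 + mu * g(x) ** 2

    x = np.array([0.0, 0.0])
    for mu in (1.0, 10.0, 100.0, 1000.0):  # increase the penalty parameter
        x = minimize(penalized, x, args=(mu,)).x

    print(np.round(x, 3))                  # tends to the constrained optimum (1, 1)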

1.1  BASIC DEFINITIONS

1. NEGATIVE DEFINITE:

The quadratic form XᵀRX is negative definite if (-1)^(i+1) Ri < 0, i = 1(1)m, where Ri denotes the i-th leading principal minor of R.

Using (-1)^(i+1) Ri < 0:

when i = 1:  (-1)^2 R1 < 0  →  R1 < 0
when i = 2:  (-1)^3 R2 < 0  →  -R2 < 0, i.e. R2 > 0
when i = 3:  (-1)^4 R3 < 0  →  R3 < 0

so that R1 < 0, R2 > 0, R3 < 0, R4 > 0, ...

2. NEGATIVE SEMI-DEFINITE:

The quadratic form XᵀRX is negative semi-definite if (-1)^(i+1) Ri ≤ 0, i = 1(1)m, and at least one (-1)^(i+1) Ri ≠ 0.

3. POSITIVE DEFINITE:

The quadratic form XᵀRX is positive definite if Ri > 0, i = 1(1)m.

Example: for the m×m matrix

R = | r11   r12   r13   ...   r1m |
    | r21   r22   r23   ...   r2m |
    | r31   r32   r33   ...   r3m |
    | ...   ...   ...   ...   ... |
    | rm1   rm2   rm3   ...   rmm |

the leading principal minors must satisfy

R1 = r11 > 0,

R2 = | r11   r12 |  > 0,
     | r21   r22 |

and so on up to Rm = det R > 0.

4. POSITIVE SEMI-DEFINITE:

The quadratic form XᵀRX is positive semi-definite if Ri ≥ 0, i = 1(1)m, and at least one Ri ≠ 0.

5. CONVEX:

The function f is convex if the matrix R is positive definite; for example, f(x) = XᵀRX with R positive definite is convex.

6. CONCAVE:

A function f is said to be concave if its negative is convex; for example, if f(x) is convex, then -f(x) is concave.

NOTE:

Whether the objective function is convex or concave depends on whether the matrix is positive definite or negative definite. When the matrix is positive definite or positive semi-definite, the objective function should be minimized; when it is negative definite or negative semi-definite, it should be maximized.
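These sign conditions are easy to check numerically. The short Python sketch below (a hypothetical 2×2 matrix chosen only for illustration) computes the leading principal minors Ri as determinants of the leading sub-matrices and applies the definitions above:

    # Classify a small symmetric matrix by the signs of its leading principal
    # minors, following the definitions above; the matrix R is a made-up example.
    import numpy as np

    R = np.array([[2.0, 1.0],
                  [1.0, 3.0]])

    minors = [np.linalg.det(R[:i, :i]) for i in range(1, R.shape[0] + 1)]
    print([round(m, 3) for m in minors])          # [2.0, 5.0]

    if all(m > 0 for m in minors):
        print("positive definite: the quadratic form is convex, so minimize")
    elif all((-1) ** (i + 1) * m < 0 for i, m in enumerate(minors, start=1)):
        print("negative definite: the quadratic form is concave, so maximize")

Here R1 = 2 > 0 and R2 = 5 > 0, so R is positive definite and the associated quadratic form should be minimized.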

1.2  LAYOUT OF WORK

There are five chapters in this project.

Chapter two is dedicated to two methods of solving constrained optimization problems: the Lagrange multiplier method and the Kuhn-Tucker conditions. It clearly shows how the Kuhn-Tucker conditions are derived from the Lagrange multiplier method in an optimization problem with inequality constraints. As part of this chapter, the global maximum, local maximum and global minimum of an optimization problem are also defined.

Chapter three presents the gradient methods and the method of feasible directions. The gradient methods are the Newton-Raphson method and the penalty function.

The gradient methods are used in solving optimization problems with equality constraints while the method of feasible directions is used in solving optimization problems with inequality constraints.

Chapter four is specifically on a type of multivariable optimization with constraints called “quadratic programming”. This chapter covers quadratic forms and general quadratic problems, and it shows the importance of two methods: the Lagrange multiplier method and the Kuhn-Tucker conditions. It explains how we can arrive at an optimal solution through two different procedures after the Kuhn-Tucker conditions have been formed: the two-phase method and the elimination method.
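To hint at what chapter four develops, the sketch below shows how, for a small equality-constrained quadratic programme (a hypothetical example; it is simpler than the inequality-constrained problems treated there and does not use the two-phase or elimination procedures), the Lagrange/Kuhn-Tucker conditions reduce to a single linear system:

    # Hypothetical equality-constrained quadratic programme:
    #   minimize  (1/2) x'Qx + c'x   subject to  Ax = b.
    # The stationarity and feasibility conditions  Qx + c + A'lam = 0,  Ax = b
    # form one linear system in (x, lam), solved here with NumPy.
    import numpy as np

    Q = np.array([[2.0, 0.0],
                  [0.0, 2.0]])          # positive definite, so the problem is convex
    c = np.array([-2.0, -5.0])
    A = np.array([[1.0, 1.0]])          # single constraint  x1 + x2 = 3
    b = np.array([3.0])

    KKT = np.block([[Q, A.T],
                    [A, np.zeros((1, 1))]])
    rhs = np.concatenate([-c, b])
    sol = np.linalg.solve(KKT, rhs)
    x, lam = sol[:2], sol[2:]
    print(np.round(x, 3), np.round(lam, 3))   # x = [0.75, 2.25], multiplier 0.5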

Chapter five is the concluding part of this project.

Each chapter starts with an introduction that facilitates the understanding of the section and also contains useful examples.

In conclusion, this project explains the different methods of solving constrained optimization problems and how some of these methods are applied in quadratic programming.
