Programa de Pós-Graduação em Geofísica - CPGF/IG
Permanent URI for this community: https://repositorio.ufpa.br/handle/2011/2355
The Graduate Program in Geophysics (CPGF) of the Institute of Geosciences (IG) at the Universidade Federal do Pará (UFPA) was the second program in Brazil to train human resources in Geophysics at the stricto sensu graduate level. Created in 1972, it operated until 1992 together with the Graduate Programs in Geochemistry and Geology.
Browsing Programa de Pós-Graduação em Geofísica - CPGF/IG by funding agency "ANP - Agência Nacional do Petróleo"
Now showing 1 - 20 of 22
Item (Open Access): Análise das aproximações RPP e RSP para meios isotrópicos (Universidade Federal do Pará, 2002-08-16)
SANTOS, Darcicléa Ferreira; PROTÁZIO, João dos Santos; http://lattes.cnpq.br/4210442535067685
This work presents linear and quadratic approximations of the Zoeppritz equations for deriving reflection and transmission coefficients of P-P and P-S events as functions of the incidence angle and of the angular average, as well as a linear inversion (AVO) analysis of separate and combined P-P and P-S reflection events. So-called pseudo-quadratic approximations were used to derive quadratic approximations, for P-P events only, around the average contrasts of the compressional- and shear-wave velocities and of the Vs/Vp ratio. The results show that the quadratic approximations are more precise than the linear ones in both angular versions. Comparisons between these approximations in terms of incidence angle and angular average show that the quadratic approximations are equivalent within the angular range of 0° to 30°. On the other hand, the linear approximations as a function of the incidence angle are more precise than those as a function of the angular average. In the linear inversion, sensitivity and ambiguity analyses were carried out, showing that with separate P-P and P-S reflection events only one parameter can be estimated, whereas combining these events stabilizes the inversion and allows two physical parameters of the media (impedance, P-wave velocity and shear modulus contrasts) to be estimated.

Item (Open Access): Análise de sensibilidade para estereotomografia em meios elípticos e anelípticos (Universidade Federal do Pará, 2005-12)
BARBOSA, Brenda Silvana de Souza; COSTA, Jessé Carvalho; http://lattes.cnpq.br/7294174204296739
Stereotomography is extended to general anisotropic models and implemented for elliptical and anelliptical anisotropy. The elliptical and anelliptical models have only three parameters, which makes them less sensitive than transversely isotropic or orthorhombic models to the ambiguity caused by the limited coverage of surface seismic experiments. The corresponding approximations of the slowness surface restrict the validity of the present approach to qP events and mild anisotropy. Numerical experiments show the potential and the limitations of stereotomography in estimating macro-velocity models suitable for imaging in the presence of anisotropy, as well as the importance of transmission events from multiple-offset VSP experiments for the success of the approach.

Item (Open Access): Análise teórica do problema de Weaver da falha infinita, modo TE magnetotelúrico (Universidade Federal do Pará, 2003-02-14)
GUIMARÃES, Raimundo Nonato Menezes; RIJO, Luiz; http://lattes.cnpq.br/3148365912720676
This work presents an analytic solution for the magnetotelluric TE-mode infinite fault, taking into account the presence of the air. The solution follows the hybrid, partially analytic and partially numeric, solution proposed in 1985 by Sampaio, in which eight boundary conditions were applied. We found that four of them are mathematically inconsistent and had to be modified; this modification led to the analytic solution discussed here. The solution is compared with those obtained by Weaver and by Sampaio and with the finite element method, using resistivity contrasts of 2, 10 and 50 between the two sides of the fault. The analytic solution obtained here for the normalized electric field fits the finite element solution better than those proposed by either Weaver or Sampaio. This is a very difficult problem, still open to a definitive analytic solution; the one shown here is a significant step toward that goal.
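For reference, the 2-D TE-mode (E-polarization) magnetotelluric problem addressed in the entry above is governed by a scalar Helmholtz-type equation; the form quoted below is the standard quasi-static statement (with an e^{i\omega t} time dependence assumed here), not the specific boundary-value formulation derived in that dissertation:

    \frac{\partial^{2} E_{x}}{\partial y^{2}} + \frac{\partial^{2} E_{x}}{\partial z^{2}} = i\,\omega\,\mu_{0}\,\sigma(y,z)\,E_{x}

Here x is the strike direction of the fault, \sigma(y,z) the conductivity distribution (essentially zero in the air layer), \omega the angular frequency and \mu_{0} the magnetic permeability of free space.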
Item (Open Access): Aplicação de modelos de substituição de fluido em rochas sedimentares oriundas do nordeste brasileiro (Universidade Federal do Pará, 2015-06-06)
TROVÃO, Ana Alzira Fayal; FIGUEIREDO, José Jadsom Sampaio de; http://lattes.cnpq.br/1610827269025210
Carbonate reservoirs correspond to about 50% of the hydrocarbon reservoirs on the planet. This type of lithology presents different forms of heterogeneity, which are the main causes of errors in its characterization and can lead to erroneous estimates of the elastic moduli of rocks in the saturated state. The main goal of this work is to perform a comparative analysis of fluid substitution models in unconventional carbonate reservoirs. Specifically, fluid substitution is analyzed on outcrop samples from the Brazilian Northeast, under controlled laboratory conditions (temperature, pressure and degree of saturation) and from the perspective of petrophysical and ultrasonic measurements, using conventional theories (Gassmann, Biot) and unconventional ones (Brown and Korringa, Muller and Sahay). Six carbonate samples and one sandstone sample were analyzed. The input data were permeability, porosity, rock and grain density, and measured compressional (Vp) and shear (Vs1 and Vs2) velocities. The velocities were measured with the rock 100% gas-saturated (dry rock) and then 100% water-saturated. The results show that predictions by the conventional fluid substitution models best fit the experimental measurements of the sample considered homogeneous, whereas predictions by the unconventional models (e.g., Muller and Sahay) best fit most carbonate types, including tufa and limestones.
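The conventional fluid-substitution baseline mentioned above is usually written as the Gassmann relation; the standard textbook form is reproduced here for context (it is not a result specific to that dissertation):

    K_{\mathrm{sat}} = K_{\mathrm{dry}} + \frac{\left(1 - K_{\mathrm{dry}}/K_{\mathrm{min}}\right)^{2}}
    {\phi/K_{\mathrm{fl}} + (1-\phi)/K_{\mathrm{min}} - K_{\mathrm{dry}}/K_{\mathrm{min}}^{2}},
    \qquad \mu_{\mathrm{sat}} = \mu_{\mathrm{dry}}

where K_{\mathrm{sat}} and K_{\mathrm{dry}} are the bulk moduli of the saturated and dry rock, K_{\mathrm{min}} the mineral (grain) modulus, K_{\mathrm{fl}} the fluid modulus, \phi the porosity and \mu the shear modulus, which Gassmann theory leaves unchanged by the saturating fluid.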
Item (Open Access): Atenuação de múltiplas pelo método WHLP-CRS (Universidade Federal do Pará, 2003-01-28)
ALVES, Fábio José da Costa; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
In the sedimentary basins of the Amazon region, the generation and accumulation of hydrocarbons is related to the presence of diabase sills. These rocks have a large impedance contrast with the host rocks, which generates internal and external multiples with amplitudes similar to those of the primary events. These multiples can dominate the information originating at deeper interfaces, making the processing, interpretation and imaging of the seismic section more difficult. In this work, multiple attenuation was performed on synthetic common-source (CS) seismic sections by combining the Wiener-Hopf-Levinson prediction (WHLP) and common-reflection-surface (CRS) stack methods, a combination we call the WHLP-CRS method. The deconvolution operator is calculated trace by trace from the true amplitudes of the seismic section, a strategy that makes the multiple-attenuation process efficient. Multiple identification is carried out in the zero-offset (ZO) section simulated by the CRS stack, applying the periodicity criterion between a primary and its repeated multiples. The wavefront attributes obtained by the CRS stack are used to move the shifting windows in the time-space domain, and these windows are used to calculate the WHLP-CRS operator for the multiple attenuation carried out in the CS sections. The development of this research had several aims: (first) to avoid the inconveniences of the processed ZO section; (second) to design and apply operators in the CS configuration; (third) to extend the WHL method to curved interfaces; and (fourth) to exploit the good results of the new CRS-stack technology, whose applications extend to migration, tomography, inversion and AVO.

Item (Open Access): Caracterização de reservatórios fraturados através de dados de ondas qP em levantamentos VSP Walkaway (Universidade Federal do Pará, 2008)
SILVA, Saulo da Costa e; GOMES, Ellen de Nazaré Souza; http://lattes.cnpq.br/1134403454849660
This dissertation presents a method to estimate the orientation of the symmetry axis of a medium, assuming it behaves effectively as a transversely isotropic (TI) medium. The fracture orientation is then obtained from the estimated symmetry axis of the TI medium. The estimation uses the slowness and polarization vectors of qP waves measured in walkaway VSP experiments. The inversion is based on equations that relate the slowness and polarization vectors of qP waves linearly to the weak-anisotropy parameters of the medium. Numerical tests are presented in which the sensitivity to factors such as the strength of anisotropy, survey geometry, type of wave used and noise level is analyzed. Test results for a set of real data are also shown.

Item (Open Access): Deconvolução de perfis de poço através de rede neural recorrente (Universidade Federal do Pará, 2006-03-05)
RUÉLA, Aldenize de Lima; ANDRADE, André José Neves; http://lattes.cnpq.br/8388930487104926
For the oil industry, well log analysis is the main source of information about the presence and quantification of hydrocarbons in the subsurface. However, in two situations new logging technologies are not economically viable and conventional logging tools must be used: the re-evaluation of mature oil fields and the evaluation of marginal oil fields. In conventional logs the data acquisition procedure may blur the value of the physical property and the vertical limits of a rock layer. This is an old problem in well logging: the trade-off between the vertical resolution and the depth of investigation of a logging tool. It is well handled by the technology of modern tools, but it persists in old conventional tools, e.g. the natural gamma ray log (GR). Here, we present a method to mitigate this kind of linear distortion in well logs by integrating the classical well-log convolution model with recurrent neural networks. We assume that a well log can be well represented by a convolution in depth between the variation of the rock physical property (the ideal log) and a function that causes the distortion, called the vertical tool response. We then develop an iterative data processing scheme, which acts as a deconvolution operation, composed of three recurrent neural networks: the first seeks to estimate the vertical tool response; the second searches for the vertical limits of each rock layer; and the last estimates the actual physical property. To start the process we supply an appropriate first guess of the ideal log and of the vertical tool response. Finally, we show the improvements in vertical resolution and in physical property evaluation produced by this methodology on synthetic logs and on actual well log data from the Lagunillas Formation, Maracaibo Basin, Venezuela.
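The convolutional forward model assumed in the deconvolution entry above can be illustrated with a minimal Python sketch; the Gaussian tool response and the blocky layer values below are hypothetical placeholders, not the tool response or data used in that dissertation:

    import numpy as np

    def forward_log(ideal_log, tool_response):
        """Blur the ideal (blocky) log with the vertical tool response."""
        # mode="same" keeps the output aligned in depth with the ideal log.
        return np.convolve(ideal_log, tool_response, mode="same")

    # Hypothetical blocky ideal log: three layers with distinct GR values (API units).
    ideal = np.concatenate([np.full(50, 30.0), np.full(30, 110.0), np.full(50, 45.0)])

    # Hypothetical vertical tool response: a normalized Gaussian window.
    z = np.linspace(-5.0, 5.0, 21)
    response = np.exp(-0.5 * z**2)
    response /= response.sum()

    measured = forward_log(ideal, response)  # smeared log; deconvolution seeks 'ideal'

The deconvolution problem is the inverse of this operation: given only the measured log, recover estimates of both the tool response and the ideal log, which is what the three recurrent networks described above iterate toward.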
Item (Open Access): Empilhamento sísmico por superfície de reflexão comum: um novo algoritmo usando otimização global e local (Universidade Federal do Pará, 2001-10-25)
GARABITO CALLAPINO, German; CRUZ, João Carlos Ribeiro; http://lattes.cnpq.br/8498743497664023; HUBRAL, Peter; http://lattes.cnpq.br/7703430139551941
Using an arbitrary source-receiver configuration and without knowledge of the velocity model, the recently introduced seismic stacking method called Common Reflection Surface (CRS) simulates a zero-offset (ZO) section from multi-coverage seismic reflection data. For 2-D acquisition it provides, as by-products, three normal-ray parameters: 1) the emergence angle (β0); 2) the radius of curvature of the Normal Incidence Point wave (RNIP); and 3) the radius of curvature of the Normal wave (RN). The CRS stack is based on a hyperbolic paraxial traveltime approximation depending on β0, RNIP and RN. This thesis presents a new CRS stack algorithm based on a two-parameter plus one-parameter search strategy, combining global and local optimization methods to determine the three parameters that define the stacking surface (or operator). This is performed in three steps: 1) a two-parameter search, using global optimization, to determine β0 and RNIP; 2) a one-parameter search, using global optimization, to determine RN; and 3) a three-parameter search, using local optimization, to refine all three parameters, taking the parameter triples from the first two steps as initial approximations. The Simulated Annealing (SA) algorithm is used in the first two steps and the Variable Metric (VM) algorithm in the third. To handle conflicting dip events, an additional parameter triple corresponding to a local minimum is determined for each ZO sample where intersecting events interfere. Stacking along the respective operator for each particular event allows their interference to be simulated in the ZO section by superposition. The new CRS stack algorithm was applied to synthetic data sets, providing high-quality simulated ZO sections and high-precision stacking parameters in comparison with forward modeling. Using the hyperbolic traveltime approximation with identical radii of curvature, RNIP = RN, an algorithm called Common Diffraction Surface (CDS) stack was developed to simulate ZO sections for diffracted waves. Similarly to the CRS stack procedure, this algorithm also uses the SA and VM optimization methods to determine the optimal parameter pair (β0, RNIP) that defines the best CDS operator. The main features of the algorithm are data normalization, the use of common-offset data, a large aperture of the CDS operator and a positive search space for RNIP. Applied to a synthetic dataset containing reflected and diffracted wavefields, the CDS stack algorithm provides, as its main result, a simulated ZO section with clearly defined diffraction events. The post-stack depth migration of this ZO section correctly locates the discontinuities of the second interface.
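The hyperbolic paraxial traveltime on which the CRS operator above is built is commonly quoted, for the 2-D zero-offset case, in the following form from the CRS literature (the same three attributes appear; the exact parameterization used in the thesis may differ in detail):

    t^{2}(x_{m},h) = \left[ t_{0} + \frac{2\sin\beta_{0}}{v_{0}}\,(x_{m}-x_{0}) \right]^{2}
    + \frac{2\,t_{0}\cos^{2}\beta_{0}}{v_{0}} \left[ \frac{(x_{m}-x_{0})^{2}}{R_{N}} + \frac{h^{2}}{R_{NIP}} \right]

where x0 and t0 are the midpoint coordinate and traveltime of the central zero-offset ray, xm the midpoint, h the half-offset and v0 the near-surface velocity. The searches described above amount to finding, for each ZO sample, the triple (β0, RNIP, RN) that maximizes the coherence of the data along this surface.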
Item (Open Access): Inversão de dados eletromagnéticos com o regularizador Variação Total e o uso da matriz de sensibilidade aproximada (Universidade Federal do Pará, 2012-12-20)
LUZ, Edelson da Cruz; RÉGIS, Cícero Roberto Teixeira; http://lattes.cnpq.br/7340569532034401

Item (Open Access): Medidas de coerência para análise de velocidade na migração em tempo (Universidade Federal do Pará, 2011)
MACIEL, Jonathas da Silva; COSTA, Jessé Carvalho; http://lattes.cnpq.br/7294174204296739
Iterative methods for migration velocity analysis depend on objective functions that measure the flatness of reflection events in common image gathers (CIG). Time migration is a simple imaging method with which to evaluate these objective functions. Using time migration, we studied the influence of the objective functions on the results of migration velocity analysis. We propose two new objective functions: Extended Differential Semblance and the product of Classical Semblance and Extended Differential Semblance. Numerical experiments with the Marmousoft data show the effectiveness of the new objective functions in estimating velocity models that produce flat events in common image gathers.

Item (Open Access): Migração em profundidade pré-empilhamento utilizando os atributos cinemáticos do empilhamento por superfície de reflexão comum (Universidade Federal do Pará, 2007-11-12)
LUZ, Samuel Levi Freitas da; CRUZ, João Carlos Ribeiro; http://lattes.cnpq.br/8498743497664023
The Common-Reflection-Surface (CRS) stack is a seismic processing method for simulating zero-offset (ZO) and common-offset (CO) sections. It is based on a second-order hyperbolic paraxial approximation of reflection traveltimes in the vicinity of a central ray. For ZO section simulation the central ray is a normal ray, while for CO section simulation it is a finite-offset ray. In addition to the ZO section, the CRS stack method provides estimates of kinematic wavefield attributes useful for interval velocity inversion, geometrical-spreading calculation, Fresnel zone estimation and the simulation of diffraction events. In this work, a new strategy for pre-stack depth migration using the kinematic wavefield attributes derived from the CRS stack is proposed, the so-called CRS-based pre-stack depth migration (CRS-PSDM) method. CRS-PSDM uses the CRS results (ZO section and kinematic attributes) to construct an optimized stacking traveltime surface, along which the amplitudes of the multi-coverage seismic data are summed and the result assigned to a point of the migration target zone in depth. As in Kirchhoff-type pre-stack depth migration (K-PSDM), the CRS-PSDM method needs a migration velocity model. Unlike K-PSDM, however, CRS-PSDM only needs to compute zero-offset traveltimes, i.e., along the single ray connecting the considered point in depth to a given coincident source-receiver position at the surface. The final result is a zero-offset, time-to-depth-converted seismic image of the reflectors obtained from pre-stack seismic data.

Item (Open Access): Migração em profundidade usando a solução numérica da equação da eiconal (Universidade Federal do Pará, 2001-06-12)
LUZ, Samuel Levi Freitas da; CRUZ, João Carlos Ribeiro; http://lattes.cnpq.br/8498743497664023
In recent years there has been increasing interest in seismic imaging algorithms that provide better information about the Earth's interior. The Kirchhoff migration method is very useful for determining the position of seismic reflectors, provided the seismic velocity model is known and the traveltimes through the Earth model are well determined. Traveltime calculation is a necessary step for stacking the seismic data by means of the Kirchhoff migration operator. In this work the traveltimes are obtained by numerically solving the eikonal equation of ray theory. First, the theory of Kirchhoff migration is reviewed, considering depth migration in heterogeneous media with arbitrarily curved reflectors. Second, the numerical solution of the eikonal equation is presented for transmitted, diffracted and head waves. Thereafter, the depth migration algorithm, which uses the traveltimes obtained from the eikonal equation, is presented. Finally, the migration algorithm is applied to synthetic models, providing very good image resolution in comparison with conventional ray-tracing migration methods, even in the presence of random or coherent (multiple-reflection) noise.
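The eikonal equation referred to in the depth-migration entry above is the standard high-frequency relation between traveltime and velocity; in 2-D it reads:

    \left(\frac{\partial T}{\partial x}\right)^{2} + \left(\frac{\partial T}{\partial z}\right)^{2} = \frac{1}{v^{2}(x,z)}

where T(x,z) is the traveltime from the source and v(x,z) the local wave velocity. Finite-difference eikonal solvers march this relation outward from the source to build the traveltime tables over which the Kirchhoff operator sums the recorded amplitudes.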
Item (Open Access): Modelagem computacional de dados magnetotelúricos marinhos 2-D (Universidade Federal do Pará, 2009)
SAITO, Kymie Karina Silva; SILVA, Marcos Welby Correa; http://lattes.cnpq.br/3213216758254128
This study investigates the scattering of plane waves caused by lateral variations of the physical properties of rocks, one of the problems most important to the success of exploration geophysics. The geophysical methods used in this dissertation are the magnetotelluric (MT) and marine magnetotelluric (MMT) methods. The tool used here is the finite element method, which efficiently solves, numerically, the differential equations for the electromagnetic fields of geological structures with complex geometries. The computational procedures were used in the development and implementation of numerical modeling algorithms for electromagnetic data, which were applied to several models with different geoelectrical parameters.

Item (Open Access): Modelagem eletromagnética 2.5-D de dados geofísicos através do método de diferenças finitas com malhas não-estruturadas (Universidade Federal do Pará, 2014-10-23)
MIRANDA, Diego da Costa; RÉGIS, Cícero Roberto Teixeira; http://lattes.cnpq.br/7340569532034401; HOWARD JUNIOR, Allen Quentin; http://lattes.cnpq.br/6447166738854045
We present a 2.5-D electromagnetic formulation for modeling marine controlled-source electromagnetic (mCSEM) data using a finite-difference frequency-domain (FDFD) method. The formulation is in terms of secondary fields, thus removing the source-point singularities. The components of the electromagnetic field are derived from the solution for the magnetic vector potential and the electric scalar potential, evaluated over the entire problem domain, which must be completely discretized for the FDFD method. Finite-difference methods lead to large sparse matrix equations that are efficiently solved by preconditioned iterative sparse-matrix methods. To overcome the limitations imposed by structured grids in the traditional FDFD method, the new method is based on unstructured grids, allowing a better delineation of the geometries. These meshes adapt completely to the models, promoting a smooth representation of their structures, and can be refined locally only in regions of interest. We also present the development of the RBF-DQ (radial basis function differential quadrature) method, which combines the approximation of functions by linear combinations of radial basis functions (RBF) with the differential quadrature (DQ) technique for approximating derivatives. Our results show that the FDFD method with unstructured grids, when applied to geophysical modeling problems, yields modeled data of improved quality in comparison with the results obtained by traditional FDFD techniques.
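The secondary-field (scattered-field) formulation mentioned in the 2.5-D mCSEM entry is conventionally obtained by splitting the conductivity and the field into a layered background part and an anomalous part; the generic statement below (quasi-static, e^{i\omega t} convention assumed) is given for orientation and is not the exact potential formulation used in that dissertation:

    \mathbf{E} = \mathbf{E}^{p} + \mathbf{E}^{s}, \qquad \sigma = \sigma_{b} + \Delta\sigma, \qquad
    \nabla\times\nabla\times\mathbf{E}^{s} + i\,\omega\,\mu_{0}\,\sigma\,\mathbf{E}^{s} = -\,i\,\omega\,\mu_{0}\,\Delta\sigma\,\mathbf{E}^{p}

The primary field E^p of the layered background σ_b is computed semi-analytically, and the anomalous conductivity Δσ acts as the source of the secondary field E^s, so the transmitter singularity never enters the numerical grid.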
Item (Open Access): Modelagem numérica de dados MCSEM 3D usando computação paralela (Universidade Federal do Pará, 2007)
SOUZA, Victor Cezar Tocantins de; RIJO, Luiz; http://lattes.cnpq.br/3148365912720676
We developed the numerical modeling of marine controlled-source electromagnetic (MCSEM) synthetic data, used in hydrocarbon exploration, for three-dimensional models using parallel computing. The models consist of two stratified layers, the sea and the host rock, with a thin embedded three-dimensional reservoir, overlain by the air half-space. We present a three-dimensional finite element technique for MCSEM modeling based on the decomposition of the coupled magnetic and electric potentials into primary and secondary parts. The electromagnetic fields are calculated by numerical differentiation of the scattered potentials. We exploit the parallelism of 3D MCSEM data in a multi-transmitter survey, in which each transmitter position shares the same forward model but has different data. For this, we use the Message Passing Interface (MPI) library and a client-server approach, in which the server process sends the input data to client processes that perform the calculations. The input data consist of the finite element mesh parameters, together with information about the transmitters and the geoelectric model of a hydrocarbon reservoir with prismatic geometry. We observe that when the horizontal width and the length of the reservoir have the same order of magnitude, the in-line responses are very similar and, consequently, the three-dimensional effect is not detectable. On the other hand, when the difference between the horizontal width and the length of the reservoir is very large, the 3D effect is easily detected along in-line profiles over the largest dimension of the reservoir; for measurements along the smaller dimension this effect is not detectable, because the 3D model then approaches a two-dimensional one. The parallelism over multiple data sets is fast to implement and to process, and its execution time is of the same order as that of the serial problem plus the latency of data transmission among the cluster nodes, which justifies this methodology for MCSEM data modeling and interpretation. Only simple 3D models were computed because of the limited memory (2 Gbytes per node) of the cluster of the UFPA Department of Geophysics.
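The transmitter-level parallelism described above can be illustrated with a minimal mpi4py sketch; run_forward and the transmitter coordinates are hypothetical stand-ins for the actual finite-element code, and the static round-robin distribution shown here is a simplification of the client-server dispatch used in the work:

    from mpi4py import MPI

    def run_forward(tx_position):
        """Placeholder for the 3D finite-element modeling of one transmitter."""
        return {"tx": tx_position, "fields": None}  # hypothetical result container

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Hypothetical in-line transmitter positions (x, y, depth in meters).
    transmitters = [(x, 0.0, 950.0) for x in range(0, 10000, 500)]

    # Each process models every size-th transmitter position.
    my_results = [run_forward(tx) for i, tx in enumerate(transmitters) if i % size == rank]

    # Collect all partial results on the root ("server") process.
    all_results = comm.gather(my_results, root=0)
    if rank == 0:
        responses = [r for chunk in all_results for r in chunk]
        print(f"Collected {len(responses)} transmitter gathers")

Because every transmitter defines an independent forward problem on the same mesh, the speed-up is close to linear in the number of nodes, apart from the communication latency noted above.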
Item (Open Access): Modelamento e correção de descentralização das imagens de tempo de trânsito (Universidade Federal do Pará, 2003)
FISCHETTI, Anna Carmela; ANDRADE, André José Neves; http://lattes.cnpq.br/8388930487104926
Borehole-wall imaging tools are widely used by geologists and petroleum engineers to identify geological events in open holes and to inspect casing. Acoustic borehole imaging tools generate a transit-time image and an acoustic amplitude image for these purposes. However, those logs may lead to unrealistic interpretations, since some tool effects can degrade the appearance of the images. This work presents a transit-time image model based on the application of Coulomb's approach to the rupture of the borehole wall under a plane state of stress, which provides the borehole cross section, i.e., the geometric form mapped by the acoustic imaging tool. Tool motion and borehole-wall imperfections are usually responsible for the displacement of the transducer relative to the borehole axis, an effect that is an important cause of imperfections in the acoustic images. Thus, a computational process that repositions the transducer onto the borehole axis yields the correction of those images, called decentralization correction. A method for correcting the tool decentralization effect is also presented, based on this model, on plane analytic geometry and on the ray method for the definition of the transit time of the acoustic pulse, with the objective of reconstructing the transit-time images acquired by the decentralized tool, that is, correcting these images so that they appear as if acquired with the tool centralized on the borehole axis.

Item (Open Access): Processamento e imageamento NMO/CRS de dados sísmicos marinhos (Universidade Federal do Pará, 2010)
NUNES, Fernando Sales Martins; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
This work is devoted to the processing and imaging of marine seismic reflection data selected from the Jequitinhonha Basin, Bahia, using two stacking methods, NMO-based and CRS-based. CRS partial prestack data enhancement was also applied to densify the CMPs. Several tests were performed with these methods to optimize the parameters of the stacking operators and to improve the processing strategies. One of the efforts during processing was the attenuation of surface-related multiples, addressed by applying the Radon and SRME techniques. Data densification was also applied to improve the signal-to-noise ratio. In conclusion, a strategy was chosen based on comparative results of better visual quality and higher coherence values, with the CRS method giving results superior to those of the NMO method, as shown by the visualization of reflection events that were not noticeable in the other sections.
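For context, the NMO-based stacking referred to above relies on the classical hyperbolic moveout of a common-midpoint gather:

    t^{2}(x) = t_{0}^{2} + \frac{x^{2}}{v_{\mathrm{NMO}}^{2}}

where t0 is the zero-offset traveltime, x the source-receiver offset and v_NMO the normal-moveout velocity. The CRS operator generalizes this single-parameter hyperbola to a multi-parameter stacking surface in midpoint and offset, which is why it can use many more traces per stacked sample.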
Item (Open Access): Processamento interpretativo de dados magnetométricos e inversão de dados gravimétricos aplicados à prospecção de hidrocarbonetos (Universidade Federal do Pará, 2007)
SANTOS, Darcicléa Ferreira; BARBOSA, Valéria Cristina Ferreira; http://lattes.cnpq.br/0391036221142471; SILVA, João Batista Corrêa da; http://lattes.cnpq.br/1870725463184491
We present two potential-field data interpretation methods applied to hydrocarbon prospecting. The first uses aeromagnetic data to estimate the horizontal projection of the boundary between the continental and oceanic crusts. It is based on the existence of magnetized geological sources that belong exclusively to the continental crust, so that estimates of the sources' terminations serve as estimates of the continental crust limit. The total-field anomaly measured over the continental shelf, continental slope and part of the continental rise is amplified using the downward continuation operator, implemented in two different ways: via the equivalent-layer principle and via the Dirichlet boundary condition. Most of the computational load in computing the downward-continued anomaly comes from the solution of a large-scale system of linear equations; this effort was reduced by processing the whole area in moving windows of smaller dimensions and by using the conjugate gradient method to solve the system of equations. Because the downward continuation operator is unstable, it was stabilized through the first-order Tikhonov stabilizing functional. Tests with noise-corrupted synthetic data showed the efficiency of both implementations in enhancing the terminations of magnetic sources belonging to the continental crust, thus allowing the estimation of the boundary between the continental and oceanic crusts. Both implementations were applied to two areas offshore the Brazilian coast: Foz do Amazonas and the Jequitinhonha Basin. The second method simultaneously delineates the basement topography and the geometry of salt structures within the sedimentary rocks using gravity data. The interpretation models consist of a set of vertical, juxtaposed 2D prisms for the sedimentary pack and of 2D horizontal prisms with polygonal cross sections for the salt structures. The solution was stabilized by incorporating geometric characteristics of the basement relief and of the salt structures that are compatible with the a priori knowledge of the geological setting. To this end, we imposed inequality constraints on the parameters of the interpretation model and used the stabilizing functionals known as global smoothness, weighted smoothness and mass concentration along selected directions. We applied the method to synthetic data from simulated intracratonic and marginal basins, with density contrast relative to the basement varying with depth and containing salt structures. The results show that the method is potentially useful in simultaneously delineating the faulted basement relief and the geometry of the salt structures. We also applied the method to real data along two gravity profiles across the Campos and Jequitinhonha Basins and obtained interpretations in accordance with the known geology of the area.
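The first-order Tikhonov stabilization of the downward-continuation operator mentioned above amounts to solving a regularized least-squares problem; the generic form is given below with illustrative notation (not taken from the thesis):

    \hat{\mathbf{p}} = \arg\min_{\mathbf{p}} \; \lVert \mathbf{A}\,\mathbf{p} - \mathbf{d} \rVert^{2}
    + \mu \, \lVert \mathbf{D}\,\mathbf{p} \rVert^{2}

where d is the observed total-field anomaly, A the (unstable) continuation operator, D a first-order spatial derivative operator and μ > 0 the regularization parameter. The resulting normal equations form the large linear system whose solution by the conjugate gradient method, in moving windows, is described in the entry above.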
Item (Open Access): Reconhecimento de fáceis em perfis geofísicos de poços com rede neural competitiva (Universidade Federal do Pará, 2015-02-27)
COSTA, Jéssica Lia Santos da; ANDRADE, André José Neves; http://lattes.cnpq.br/8388930487104926
The description of a depositional system based on the recognition of sedimentary facies is critical for the oil industry to characterize the petroleum system. In the absence of facies descriptions from cores or outcrops, we present a methodology based on an intelligent algorithm able to identify facies of interest in wireline logs. The methodology uses a competitive neural network to extract geological information from the physical properties mapped in the M-N plot. The competition among neurons identifies, in non-cored boreholes of the same oil field, the facies of interest previously identified in a cored borehole. The purpose of this methodology is to encode and transfer the geological information gained in cored boreholes to non-cored wells and thus achieve the geological interpretation of the facies of interest in an oil field. The methodology was evaluated with synthetic data and actual wireline logs from two cored boreholes drilled in the Namorado oil field, Campos Basin, Brazil.

Item (Open Access): Regularização em estereotomografia (Universidade Federal do Pará, 2009)
MELO, Luiz André Veloso; COSTA, Jessé Carvalho; http://lattes.cnpq.br/7294174204296739
Obtaining an accurate velocity model is an essential part of imaging complex structures, and in complex environments conventional methods do not produce satisfactory results. Slope tomography is an effective tool for improving the velocity estimate. This method uses the slowness components and traveltimes of picked reflection or diffraction events for velocity model building. On the other hand, the unavoidable incompleteness of the data requires additional information to ensure the stability of the inversion. One natural constraint for ray-based tomography is a smooth velocity model. This study evaluates smoothness regularizations for slope tomography that require the evaluation of partial derivatives of the velocity model with respect to the spatial coordinates. One of the evaluated regularizations is a new kind of smoothness constraint based on the reflection angle. The results are evaluated in terms of data misfit, the recovered velocity models and the scattering points recovered after inversion of synthetic data. In numerical tests the new constraint leads to geologically consistent models.
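Smoothness-regularized slope tomography of the kind evaluated in the last entry is typically posed as a penalized data-misfit minimization; the generic form below is illustrative only (the specific reflection-angle-based constraint proposed in the dissertation is not reproduced here):

    \Phi(\mathbf{m}) = \lVert \mathbf{d}_{\mathrm{obs}} - \mathbf{d}(\mathbf{m}) \rVert^{2}
    + \lambda \left( \left\lVert \frac{\partial v}{\partial x} \right\rVert^{2}
    + \left\lVert \frac{\partial v}{\partial z} \right\rVert^{2} \right)

where d_obs collects the picked traveltimes and slopes, d(m) their ray-theoretical predictions for model m (velocity field plus scatterer positions and ray directions), v the velocity and λ the trade-off parameter weighting the spatial-derivative smoothness penalty.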