Work Summary:
A business, systems, and data analyst, developer, and programmer; during the pandemic and periods of limited staffing, assigned the role of lead Tier- Excel/VBA trade floor support person at Morgan Stanley.
Considerable RAD and traditional IT background and experience, deployed and embedded with the business... A liaison among stakeholders, PMO, and IT; also lead developer and programmer supporting large portfolios of sophisticated, complex Excel/VBA SQL applications that use ADO and ODBC connectivity with the major DBMSs (Oracle, SQL Server, Sybase, SAS (IOM), DB, and MS Access), plus live data feeds from real-time information providers such as Bloomberg and Reuters, along with SharePoint, cloud, and ETL...
I am dependable, easy-going, able to multi-task, and solid under pressure... An Excel/VBA and formula expert... I assimilate fast. Adept at stepping in, filling gaps, solving problems, and fitting in... Strong RCA skills... Comfortable working with all levels of management and the organization... An affinity for detail. Strong business acumen. Solid written, verbal, PowerPoint presentation, and communication skills, including documentation and technical writing...
Company Overview:
(Fortran/C/PL/Java as student tutor and adviser)
Education:
University of Pennsylvania. Bachelor of Science, Mathematics (Quantitative Methods), Computer Science…
Application Experience:
Extensive IT and technical background as analyst, programmer, and project lead working exclusively with Excel/VBA and SQL technology platforms for + years. Applications include CRM, PNL (P&L), FP&A, S&OP, ERP, Forecasting and Predictive Analytics (my strengths), also Planning, Budget, Spend, and Runout... Expert skill with Excel/VBA, including intricate array formulas, PivotTables, PivotCharts, PowerPivot, Slicers, Sparklines, Power Query, Power BI/DAX, heatmaps, dashboards, scorecards, APIs, and custom UDF functions. Heavy experience with shared Excel workbooks, and with workbooks whose cell formulas reference cells residing in other workbooks, connecting large networks of these workbooks to one another. Developed workbook applications that support the culling and filtering of reference data, sometimes many hundreds of thousands of rows, where seriously advanced performance and optimization techniques are essential to making this kind of workbook feasible, especially for the more sophisticated trading applications with live, frequent, real-time refresh rates (e.g., Bloomberg, Reuters) and numerous continuously refreshed URL data connections. These technical skills are current. In my most recent projects with Abbott, TIAA, and Morgan Stanley, I wrote thousands of lines of VBA, and I have maintained this pace of coding throughout my career, balancing project lead responsibilities with many of these projects along the way...
With Morgan Stanley and several major investment banks in NYC, deployed and embedded with Quant and support teams, sitting side-by-side with derivatives (CLO, CDO, CDS, MBS, ABS, LBS), Fixed Income (Rates), FX, and Equities traders and trading desks, wearing many hats as developer, programmer, and rapid-response support person for urgent or broken Excel/VBA trading apps, modifications to live real-time pricing models (LIBOR, (re)building a yield curve, spreads), and RAD programming initiatives. Considered a high-pressure role, with many high-pressure moments. For me, a very fulfilling, rewarding part of my overall career and work experience.
ETL, EDI, and FTP data conversions, transformations, data mapping, and data migrations with CSV and TXT files during startup phases of new projects, or ongoing feeds to and from legacy platforms, enterprise databases, and the cloud, i.e., handshakes to and from other systems such as SAP. ETL has been an important part of every project throughout my career. Sometimes under-appreciated, these are critical and often very complex processes that require sophisticated safeguards, control reports, exception detection, and sometimes suspense files that are recycled automatically with each next run to assure data quality and data integrity...
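To illustrate the suspense-file pattern described above, here is a minimal Python sketch (the production versions were Excel/VBA and SQL; every name here is hypothetical): rejected rows are held in suspense and re-validated automatically on the next run, with a control report reconciling every row in and out.

```python
# Hypothetical sketch of a suspense-file recycle: rows that fail
# validation are suspended and retried on every subsequent run, so
# late-arriving reference data clears earlier rejects automatically.

def run_etl(new_rows, suspense, valid_keys):
    """Validate new rows plus recycled suspense rows against reference keys.

    Returns (loaded, new_suspense, control_report)."""
    loaded, rejects = [], []
    for row in suspense + new_rows:          # recycle prior suspense first
        if row["key"] in valid_keys:
            loaded.append(row)
        else:
            rejects.append(row)              # hold for the next run
    control = {
        "input": len(new_rows),
        "recycled": len(suspense),
        "loaded": len(loaded),
        "suspended": len(rejects),
    }
    # control-report reconciliation: nothing is ever silently dropped
    assert control["input"] + control["recycled"] == control["loaded"] + control["suspended"]
    return loaded, rejects, control
```

A second run with the prior run's suspense and an updated reference set clears the earlier reject without any manual re-entry.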
Business Analyst:
Embedded with the business, liaising with stakeholders, PMO, and IT, and using SDLC (Waterfall) and ALM, I authored many BRDs and made presentations to senior management. The essential first steps are conferring with stakeholders, business areas, and project teams, building consensus, and confirming that all of us are on the same page and fully understand what we are being asked to do... Authored many FRDs, DFDs, and Use-Case drawings, using Agile, Jira, Scrum boards, and Visio, wearing many hats. As JAD facilitator, I present the DFD, which is the deeper study: a detailed mapping, the business model, a forest view that identifies all operational areas, data sources, and the information and data flow to and from each bubble, one operational area to another; essentially, this is the scope of the project... If it's not on the drawing, it's not in scope, though revisions do happen. Heavily annotated lines and arrows with supporting narrative describe each exchange of data and information from bubble to bubble, to and from proposed new and existing data store(s); and if there were gaps or missing pieces, we caught them here... My goal, the final presentation, is to gain a final consensus, a key milestone, and then shift to a higher gear, developing the SRD and making the transition from SDLC Waterfall to the more traditional IT Waterfall, where I switch to using Jira and MS Project... Throughout my career I've gained the bandwidth and skillset to present to senior management, huddle at the whiteboard, and sit side-by-side with the project team in many roles: developer, designer, programmer, and everything in between.
HR/HCM Projects: Numerous projects as designer and developer assisting HR with the automation of the annual budget, staffing plan, salary planning, employee review/rating, and the annual global employee census collection across Europe, Asia, and the Americas, a process that begins every September. A template, essentially an Excel workbook, one for each profit/cost center location, allowing for cross-border currency differences and conversions. Also a salary increase and bonus funding pool attribution process using KPI metrics as predicates; examples would be regional profitability, budget versus spend, and fewer incident reports, among many other performance metrics... Developed payroll compliance reports with sensitivities to diversity and discrimination in its many forms... See Teleflex Int'l HR project details, page ...
Data Analyst:
Formal study of data normalization, data modeling, data science, and database design at the University of Pennsylvania (as a math and computer science major), and, as an IT DBA and lead DBA with Unilever, considerable experience designing relational databases and data warehouses of considerable scale and complexity. Some examples of large database initiatives, data mining, data modeling, and analysis follow...
With ERP and related projects, built and performed studies on databases with millions of rows of historical data, testing for trend, cycles, interval, amplitude, seasonality, a distinguishable signal. Built many fully automated model-fitting (RPA/ML) and back-testing processes spanning many forecasting methodologies: triangles, time-series techniques, Bayesian linear techniques, and single, double, and triple exponential smoothing (choosing optimal smoothing constant sets). The process decides which of these algorithms is best suited and most accurate, product by product, to produce SKU-specific forecasts that feed the production planning process for a multi-national company that manufactures millions of pounds of product each month. Knowing which products to make, when to make them, in what quantities, and where to inventory them (warehouse distribution), in an industry that is always capacity constrained, has a direct impact on logistics costs and the company's bottom line. From this experience, I found that the accuracy of a forecast isn't just a function of the arithmetic used to produce it; it's also a function of the preparation, organization, and quality of the data it feeds from (described more fully below). I developed many fully automated data modeling and model-fitting techniques to identify which specific forecasting method to use for each of the company's varied products, SKU by SKU, where each SKU exhibits its own trend, cycles, and seasonality. Please see page in particular for intricate detail of several large-scale Forecasting and RPA/ML AI applications...
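The model-fitting and back-testing idea can be sketched briefly. This is an illustrative Python re-creation (the original work was Excel/VBA over enterprise data), reduced to single exponential smoothing only, with a grid search over smoothing constants scored by one-step-ahead back-test error; all names are hypothetical.

```python
# Per-SKU model fitting by back-test: try each candidate smoothing
# constant against history and keep the one with the lowest
# one-step-ahead mean absolute error (MAE).

def ses_forecast(history, alpha):
    """Single exponential smoothing; returns one-step-ahead forecasts."""
    level = history[0]
    forecasts = []
    for y in history:
        forecasts.append(level)              # forecast made before seeing y
        level = alpha * y + (1 - alpha) * level
    return forecasts

def backtest_mae(history, alpha):
    """Mean absolute one-step-ahead error over the whole history."""
    f = ses_forecast(history, alpha)
    return sum(abs(y - fh) for y, fh in zip(history, f)) / len(history)

def fit_alpha(history, grid=(0.1, 0.2, 0.3, 0.5, 0.7, 0.9)):
    """Pick the smoothing constant with the lowest back-test MAE."""
    return min(grid, key=lambda a: backtest_mae(history, a))
```

On a strongly trending SKU the back-test favors a high alpha (the forecast must track the level quickly); a flat SKU is insensitive to alpha. The production process extended the same selection loop across multiple methodologies, not just one.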
At EmblemHealth, a leading health insurance company in NYC, a finding from my studies of the millions of rows of claim history analyzed each month was that, in order to more accurately calculate future claim reserve projections, the data must first be organized in homogeneous cohorts (projection cells), where each grouping of insureds shares a similar profile, the same demographics... This is a disciplined data analysis process using methods such as chi-square, multiple regression, correlation coefficients (similar to market beta), and covariance, the trend correlation across many datasets. With the use of Excel/VBA, it became possible to demonstrate that when data is organized in these cohorts, with greater homogeneity and similar profiles, and with more advanced predictive models (triangles) that better understand medical event costs, specifically the considerable additional costs during the period of recovery (the runout), previously thought not predictable, the projections are greatly improved, more accurate, and consistently produce a reliable, sufficient cash reserve number. Never being under-reserved is a mandate under the New York State Insurance Commission's stringent requirements (they impose heavy fines), and this was a major justification for the project. Please find a detailed description on the following pages...
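The triangle idea behind those runout projections can be illustrated with a toy chain-ladder sketch in Python (the actual system was Excel/VBA and far more involved; the numbers and function names here are invented): cumulative claims by cohort (rows) and development age (columns), with age-to-age factors from completed cohorts projecting the runout of open cohorts.

```python
# Toy chain-ladder: volume-weighted age-to-age development factors
# from a cumulative claims triangle, then projection of an
# incomplete cohort's runout.

def development_factors(triangle):
    """Age-to-age factors; triangle rows hold ages 0..len(row)-1."""
    n = max(len(r) for r in triangle)
    factors = []
    for age in range(n - 1):
        num = sum(r[age + 1] for r in triangle if len(r) > age + 1)
        den = sum(r[age] for r in triangle if len(r) > age + 1)
        factors.append(num / den)            # volume-weighted average
    return factors

def project(row, factors):
    """Complete a partially developed cohort using the factors."""
    out = list(row)
    for age in range(len(row) - 1, len(factors)):
        out.append(out[-1] * factors[age])
    return out
```

Grouping insureds into homogeneous cohorts first, as described above, is what makes the factors stable enough to trust; mixing dissimilar profiles in one triangle blurs the development pattern.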
Mass mail-merge: labels, letters, email, invitations, attachments, notifications, forms, artifacts, contracts, and personalized document distribution. Built numerous applications leveraging PDFs, PDF Toolkit, VBScript, Excel/VBA, Microsoft Word, Outlook HTML (OLE), and sophisticated bookmarking, using Word/VBA and sometimes VBScript to produce massive volumes of labels, as well as many thousands of important, time-sensitive documents, emails, invitations, and appointments with attachments, and did so in a wide range of industries, e.g., Cenlar Mortgage Servicing Company, NJ.
Innovation Projects:
Developed numerous Excel tools. One example is a tool that performs sophisticated workbook comparisons (comparing the current instance with a prior or other like instance of the workbook in question), an RCA tool for isolating and debugging a broken Excel workbook. It is fully automated and scans all parameters and properties that live deeper inside these Excel workbooks; it dives deeper inside each worksheet, and deeper still inside each cell, examining the hundreds of cell properties. Examples would be conditional formats; hard-coded validation lists, or pointers to validation lists in other ranges of cells on the same or different worksheets; cell ranges that point to live data connections; sparklines (and the cell ranges they feed from); and range names, which sometimes point to cell ranges that live inside other workbooks (a problem especially if the other workbook is accidentally moved to a different filepath); plus the more familiar properties like locked/unlocked, color, font, font size, bold; it's endless... It's far more than the simple comparison of each cell formula in one workbook to the corresponding cell in the other, which typically is as far as most of these kinds of "compare tools" go... This tool also compares every property associated with external data connections, VBA query-table URLs, and ADO SQL database connections, also PivotTable properties and chart properties, i.e., a chart's data-series collection cell-range pointers. It does all this with VBA (VBA code that looks at VBA code), looking for modifications: recent changes to VBA code, including VBA code that lives inside add-ins and APIs. If code changes are detected, no matter how subtle or small, they generally are what caused or contributed to a broken, misbehaving, and/or corrupted workbook...
This process also looks for changes to the state of names (range names in the Formulas > Name Manager menu ribbon) and the cell address(es) they point to, including external references to cells residing in other workbooks, where for any number of reasons these connections/pointers become broken. It happens a lot... Note that this is a far more comprehensive tool than existing tools of this nature, even those made available by Microsoft, and I believe this tool may very well be the only one of its kind with this degree of complexity and capability. It has received much praise and kudos from top senior management at Morgan Stanley... When an important workbook suddenly stops working, it's almost always the result of a change to it, sometimes inadvertent, or to something it feeds from upstream; the example is a trading tool on the trading floor, feeding from an external source, during a high-pressure trading moment... Diagnosing and solving these problems as fast as possible is the highest priority... It was shared with me that Morgan Stanley senior management was appreciative, and that the person who developed this tool "must've really known their stuff" is how word of it got back to me, definitely a good moment for me...
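Stripped of all the Excel plumbing, the comparison core of a tool like this reduces to a recursive diff over property snapshots. A minimal Python sketch (the tool itself was VBA; the hand-built nested dicts here stand in for workbook, worksheet, cell, and name properties, and all the keys are invented):

```python
# Recursive diff of two nested property snapshots: yields the path of
# every property that differs between the prior and current workbook,
# including properties present in only one of the two.

def diff(a, b, path=""):
    """Yield (path, old, new) for every differing property."""
    if isinstance(a, dict) and isinstance(b, dict):
        for key in sorted(set(a) | set(b)):
            yield from diff(a.get(key), b.get(key), f"{path}/{key}")
    elif a != b:
        yield (path, a, b)
```

The hard part in practice is building the snapshot exhaustively (formulas, conditional formats, validation lists, connections, chart series, range names, VBA module text); once everything is in one tree, the same few lines of diff logic cover all of it.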
Also with Morgan Stanley, I developed an impact anticipation tool used ahead of rollouts of updates (software refreshes) to existing add-ins, APIs, and common code, specifically for trade-related workbooks. This tool walks every filepath (every folder and subfolder) on all servers company-wide (it looks behind every pillar and post), and with user-supplied search criteria it jumps inside each and every Excel workbook as it traverses each filepath/folder, examining every worksheet; it looks at every cell, its value, its formula, and every property deeper inside each cell... Also every data connection, every add-in, every API (especially anything being obsoleted or replaced with a new API), every validation rule, every conditional format, any property that might reference or utilize an API, add-in, or common code. As mentioned above, using VBA code it looks at each workbook's VBA, every line of code, in every module, every event, every add-in. Doing this, it compiles a listing of every Excel workbook it finds with content matching criteria that indicate impact; its main purpose is to be proactive, to identify the workbooks that will be impacted by a rollout or software refresh of this kind (it makes a list)... Its mission is to avert breakage resulting from an impending change to a common service, add-in, or API that these workbooks may be using or feeding from upstream. These are the Excel workbooks, especially trading tools, that tend to proliferate and get copied and freely distributed throughout the many trading desk communities company-wide (world-wide), which makes having these kinds of tools essential, especially for larger global investment banks and multi-national companies. Sometimes, as a trade floor support person asked to be in numerous places at the same time, a one-person army as the highest-tier problem solver, it was essential for me to have or build tools like these...
I developed many sophisticated tools like this throughout my career. My role normally is as a developer in its many forms, but just like a volunteer firefighter, I do receive the urgent calls as the Tier- trading floor and senior management solution provider... For me, this role is a perfect blend: developer, RAD deployment, and Excel/VBA go-to special projects and support person...
My technical background and skillsets, the many acronyms of technology I work(ed) with: advanced Excel/VBA/SQL, macros, Power BI, DAX, VBScript, VB, .bat, add-ins/API, REST API, Word VBA; heavy SQL with ADO/ODBC database connectivity to MS Access, Salesforce, CRM, SAP/ABAP, Fieldglass, SQL Server, SSRS/SSIS; UI/UX, UDF, dashboards, Tableau, Sybase; mainframe: MVS, DB, VSAM, CICS, COBOL, JCL, SPF, TSO, and SAS (with IOM), and most every DBMS... SaaS; heavy Jira, Scrum, Agile; ETL, EDI, HL data migration, FTP; cloud-based apps (Azure, AWS); RPA/ML AI and Blue Prism. Considerable use of Hyperion for rollups and consolidation (PNL/GL), QlikView, Adobe PDF Toolkit, Cognos, XML, SOAP/REST API, SharePoint, Java, R, Python, VSTO, Scripting Dictionary, .NET, VB, C#, MS , MS Project, Word, Outlook, PowerPoint, Visio, and an OOP programming style... Note: Visual Basic, aka VBA, and VBScript.
With recent projects, on any given day you can find me at my desk coding, in meetings with stakeholders, making a presentation, or sitting side-by-side with a junior programmer debugging a program, and that same day you might find me on a trading floor fixing a broken trading app... Considered a seasoned professional, embedded with the business, a low-key, team-oriented individual who can step into a high-pressure role, technical and otherwise, on day- and hit the ground running... Please consider me for this opportunity... Thank you...
Company-Specific Project Descriptions:
Abbott Labs, Chicago/NYC, contract; Excel/VBA SQL, Oracle DBMS, order fulfillment, Salesforce (WFH and onsite, Jan to present)
A backfill role as an Excel/VBA programmer. Abbott, well into their project, determined they needed more experienced Excel developers to re-write their Order Entry, Pricing, Order Fulfillment, Supply Chain, Shipping, and Invoicing applications, and they reached out to me. Having initially thought Excel would be the fastest and easiest way to build this application, Abbott realized well into the project that much of the complexity of this kind of application would need to be written in VBA code. The UI/UX aspects of the application were ideal for an Excel presentation layer, but there was no easy way to achieve data connectivity between the spreadsheets and Abbott's massive enterprise database. I developed numerous APIs to address what was identified as system- and enterprise-wide reusable code. This was a high-priority application that needed to be brought online as quickly as possible to strengthen an aging order fulfillment system amid supply issues in the overall marketplace; Abbott, at this time, was under considerable pressure from much-publicized supply-chain issues. This was an ideal role for me as lead programmer and project lead: Excel and VBA with ODBC connectivity to an Oracle database are a strength, and this project needed to be up and running ASAP; it was clearly the top priority at Abbott...
TIAA-CREF, contract, NYC / Durham, NC (onsite/remote) (July to January ); Excel/VBA SQL technology platform...
Assigned role as analyst, developer, and programmer embedded with Actuarial and Quant teams, developing an Excel workbook Pension and Retirement Funds Management and Administration System. Its purpose is the management of retirement, savings, and investments and related transaction activity; examples: derivatives, insurance, dividends, annuity surrender, premium payments, untaxed gains, taxable gains, loan withdrawals, taxable income, disbursements, distributions, cost basis calculations and valuations, and a predictive cash flow and runout projection process for millions of members (UFT Teachers Union) and many billions of dollars... I was asked to develop this sophisticated spreadsheet: heavy emphasis on UI/UX features, just shy of . lines of VBA code, and a fully automated refresh process with ADO/ODBC SQL connectivity to back-office Oracle databases.
Walmart (Bentonville, AR), remote; Excel/VBA and Word/VBA during outbreaks of COVID-19 (May to July )
Large-scale office automation, urgent short-term projects with major retailer Walmart: Excel/VBA and extensive Word/VBA (a very rare skillset) and HTML, with sophisticated labeling and bookmarking for mass mail-merge, sending many thousands of individualized postal letters and literature to customers, buyers, vendors, business associates, and employees, a real challenge given the considerable time constraints. Also built a mechanism to send mass email via Outlook to thousands of employees and business associates, also buyers and vendors, intra-company, with individualized (personal) attachments, plus Outlook meeting invitations with meeting agenda attachments and exhibits, working with Walmart's Bentonville, Arkansas IT location, one of the world's largest IT departments; a high-pressure role with considerable time constraints during COVID-19 regional peaks, spikes, and outbreaks that made mass tactical correspondence essential... I did this to assist Walmart IT, surprisingly less skilled in this area, in getting it done... And I have done this type of work many times, at companies such as Cenlar, a mortgage servicing company located in Pennington, NJ, and on an ad-hoc basis at many of the companies I worked with throughout my career.
Morgan Stanley - Midtown Manhattan, NY (September to March , WFH March to May )
Joined the STRATs and FX Quant teams as a developer/support person, initially asked to build a new class of trading, pricing, spread, and yield-curve applications with emphasis on data visualization, office automation, dashboards, heatmaps, and scorecards. When COVID spread through our building at Broadway, Morgan Stanley invoked their continuity plan, and employees and consultants alike were asked to transition to working from home. I was assigned the role of Morgan Stanley's Tier-, Tier- Excel/VBA go-to support person, embedded with trading desks, sitting side-by-side with traders; at that time, our top priority was simply to keep our technology platforms stable, up, and running. A reputation for multi-tasking with trading desks, a go-to support person who could step up and do the 'work of many', my managing director shared with me during an annual review... Pre-pandemic, some technical highlights include the creation of a new class of worksheet function (an add-in with , lines of VBA) that produces sophisticated business graphics. This was considered a very special capability and received lots of attention. Normally, a worksheet function works within the scope of the one cell it resides in; building a dashboard of sophisticated charts and sparklines generated from a worksheet function was not generally thought doable. MS's Quant community, determined to have this capability, asked me to find a way to make it happen: an ability to rapidly build and distribute sophisticated charts with enhanced graphics from what looks to be a simple custom worksheet function (a UDF), a formula in the cell that serves as the top-left corner of the chart it just generated. We named it FastChart: =FastChart(). And as with all BI dashboard tools, Excel provides one primary and one secondary Y axis for its charts, but what I was being asked to do required the creation of a third Y axis; I refer to it as a tertiary Y.
This required a modification, an enhancement, to the Microsoft Excel Chart object. It was a high-visibility project, thought not possible to do, a request that came from senior management to make happen (see the last page's discussion of why there was so much interest in this dashboard). I was given opportunities to develop numerous custom add-ins (APIs), powerful tools, many of which attach automatically to any workbook (using VBA code that writes VBA code, which it embeds inside the VBE VBProject property of the receiving workbook during the initial add-in or API open event). These tools became popular throughout the Morgan Stanley Quant community, senior management, and the Derivatives and Options trading desks... I was also asked to participate in many urgent special projects, including BI dashboards that feed from databases and data warehouses, local and global, via connections from within the VBA code, e.g., SAS IOM and ADO/ODBC connectivity to Access, SQL Server, Oracle, DB, Sybase, and the cloud; pretty much all of Morgan Stanley's DBMSs... Also Pioneer, Bloomberg, and Reuters add-ins that provide real-time/live ticker data... These kinds of projects and opportunities have come my way throughout my career, which probably explains why, to this day, I love working with Excel and VBA; every project is a thrill. Morgan Stanley's PMO team believed I would be the right person for this role after what I understand was a lengthy candidate search. I wasn't initially hired to be a technical IT person, but was recognized as having a rare combination of strong technical Excel/VBA skill and strong business acumen and experience, and I met the criteria for the person Morgan Stanley was looking for.
MFX Fairfax / Ironshore P&C Morristown, NJ (May to Sep )
MFX Fairfax provides software and software services for the property and casualty insurance industry. In this role, as business, systems, and data analyst and Excel/VBA SQL developer/programmer, heavy collaboration with Ironshore P&C, New York City. Working closely and embedded with Ironshore actuarial staff, I was tasked with the development and overhaul of a large, diverse set of casualty insurance risk/rating/pricing workbooks, preparing them for a process that persists key policy submission, underwriting, booking, rating, and pricing data, artifacts, and supporting documents, essentially all relevant information, to a centralized enterprise database, itself in a state of development when I first came on board... Extensive modifications were needed to the embedded rating and pricing functions in each of the workbooks. These are large, sophisticated workbooks; they undergo considerable modification in response to steadily changing business needs. Formats and layouts vary from insurance coverage to coverage, but much of the underlying key data is constant across all insurance products. The initial approach described to me during the interview was simply to map each cell address in a workbook to a specific database table and corresponding column/fieldname. I was asked during the interview how I might approach this project, mindful of an incomplete database design and workbooks in perpetual states of change. I cited some risks, starting with one example: something as simple as the insertion of a row or column in one of these workbooks, which is commonplace, could easily break the process. This underscored the value of using range names in this project, though not necessarily for all Excel-based applications...
I added that using range names for key data, versus a mapping of rigid, non-scalable cell addresses, would eliminate ongoing (re)mapping (IT support), along with other, larger advantages: the persisting process could be accomplished by simply walking the (range) names collection with VBA, wherein each range name always accurately points to the actual cell to be persisted (even if the cell is moved elsewhere in the workbook). This in turn could feed, and indeed does feed, the SQL engine that updates a simple database table consisting of (policy-id, range-name, element-name, and value), the mapping that serves as the linchpin, essentially a database staging area that sits between the workbook and what would ultimately become the Policy Object Database. MFX liked this idea, believed I grasped the complexities of the task, hired me, and went with the concept I outlined, which actually simplified the overall process. It allowed the database team, the Ironshore actuaries, and my work to have reduced dependency on one another, in that we could work independently toward the same goal without getting in one another's way. I developed a number of Excel-based automation tools to help build this system. In entrusting me with this responsibility, MFX and their client Ironshore believed I had the intuition for how this system would need to be put together, an imperative for all of us in meeting the tight timelines and milestones along the way.
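A hypothetical miniature of that staging design, using Python and sqlite3 in place of VBA and the enterprise database, and simplified to three of the columns described above: the range name, not a cell address, is the stable key, so inserting rows or columns in the workbook never breaks the mapping.

```python
# Persist a workbook's named values into a staging table keyed by
# (policy_id, range_name). Re-running after a workbook edit simply
# replaces the affected rows; no cell-address remapping is needed.

import sqlite3

def persist_names(conn, policy_id, names):
    """Upsert every (range-name, value) pair for one policy submission."""
    conn.execute("""CREATE TABLE IF NOT EXISTS staging (
        policy_id TEXT, range_name TEXT, value TEXT,
        PRIMARY KEY (policy_id, range_name))""")
    conn.executemany(
        "INSERT OR REPLACE INTO staging VALUES (?, ?, ?)",
        [(policy_id, n, str(v)) for n, v in names.items()])
    conn.commit()
```

In the real system the `names` mapping came from walking the workbook's names collection with VBA, and the staging table fed the eventual Policy Object Database downstream.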
TV Guide (ROVI), Philadelphia, PA (suburb) (June through December ). Performed work for TV Guide (ROVI) on a global RAD initiative to automate receipt of television schedules and programming information sent to ROVI from local broadcasting companies around the world in varied formats and languages, with exclusive use of Excel/VBA and SQL. All of it is transformed into a common format (a massive ETL process) that previously required a large staff working manually and is now fully automated; a huge initiative that now feeds all of this content to TV Guide's enterprise database, from which it is digitally re-distributed globally to every cable and set-top box in America, Europe, and Asia-Pacific, and someday, their goal, every cable and set-top box everywhere... This URL describes ROVI as the most important company no one has ever heard of: https://www.businessinsider.com/the-most-important-media-tech-company-you-dont-know-rovi--
Bank of Tokyo, Mitsubishi Securities, NYC (February through early June ). As business, systems, and data analyst and lead developer, I joined the team well after the project was underway. Categorized as a RAD initiative; I helped the FP&A management team meet an aggressive target to go live with a new generation of financial reports targeted for March month-end.
Validation list issues were also not a problem with Excel . The only hitch I needed to overcome was persuading the management team to do an emergency upgrade from Excel to , which was otherwise prohibitive this late in the project. But doing so allowed us to address all showstopping issues, which put us back on a glide path to produce the initial set of financial reports on time, as promised. I developed the financial reports considered highest priority for meeting this first deliverable target date.
Teleflex Medical Implant and Instrumentation Corp, Limerick, PA (Sept to Feb ). As business, systems, and data analyst, developer, and programmer. With Excel/VBA, SQL, and an ODBC connection to a Microsoft Access DBMS via ADO, I was asked to develop an HRIS/HCM Human Resources total compensation management application, one of the more challenging ETL initiatives of my career. This is an employee review, headcount, compensation, and bonus planning tool. Teleflex was a multi-national company with ,+ employees during the timeframe of this project. Their SAP HRIS module was in early development and not available for the annual review, compensation, and bonus planning process, so Teleflex went with an Excel/Access interim alternative.
EmblemHealth Insurance Companies, Water St, Lower Manhattan, NY (Jan to Sep ). As business and data analyst / developer. Excel, VBA, ActiveX controls, ADO, SQL, Hyperion, MS Access, Oracle/SAS/IOM (SAS's proprietary ODBC equivalent).
JPM/Bear Stearns, Midtown Manhattan, NY (Sept to Jan ). As business, systems, and data analyst, developer.
Technologies: Bloomberg, Reuters, MS Access, Excel, VBA (macros), ADO, CDO, SQL, SQL Server, Sybase, Oracle, SharePoint, VB.NET.
Deployed directly with Front Office, Middle Office, Back Office, Trading, Marketing, Equities and Derivatives Teams.
Citigroup, Midtown Manhattan, NYC (Jan to Sept ). Sr. Business Analyst / Lead Developer (backfilled position to fill a staffing gap mid-project). Technologies: Excel, VBA (macros), ADO, CDO, Pivots, dashboards, SQL, Access, SQL Server, Oracle...
Ingredion, Imperial Chemical Company (ICI), Bridgewater, NJ. Sr. Business, Systems, Data Analyst / Designer / Developer.
Excel, VBA (macros), (Power)Pivots, SQL Server, Sybase, Oracle, SAP, VB.NET... As consultant and lead Excel/VBA developer, a go-to person for Excel and VBA assistance across all operational areas. Conducted numerous advanced Excel training classes and lunch-and-learns for employees. Developed many Excel/VBA/SQL applications that feed from Access, Oracle, Sybase, and SQL Server using ADO/ODBC. Developed a budget and performance tracking system using Excel, VBA, Access, and an ETL process with feeds to and from SAP. A joint project with HRIS and IT: a complete global IT budgeting process covering salary planning, vendor management, equipment, software licensing and other licensing, leasing, and departmental chargebacks. A system unto itself: an employee review process, employee census, and new-hire planning. Dozens of separate departmental budget planning spreadsheet templates were built with interfaces to the G/L and COA; each worksheet, a template, provides for FTE, employee rating, bonus compensation rules, and multiple levels of pro-ration, in part predicated on pre-defined KPI parameters, with bonuses determined by each operational area's contribution to profit. Templates perform local and US currency conversion. Spreadsheets reside on shared servers. Rollup and consolidation of budget items to G/L accounts: each departmental budget spreadsheet's content is consolidated (rolled up) to the corresponding business entity's spreadsheet, and all business entity spreadsheets are then consolidated, a rollup, to a corporate spreadsheet. Monthly budget-versus-spend reports use a dashboard format with clickable graphics for drilldown to the underlying departmental data, with heavy emphasis on UI/UX. The objective is to identify overspend and underspend and, with simulation, project year-end impact. The dashboard is designed so that a click on a datapoint shows the underlying data, possibly answering the question of what caused the anomaly, the skew of actual versus budget.
Designed for the CIO, CTO, CFO, and CEO, and reinforces the value of a good UI and UX…
Henkel, Unilever, National Starch and Chemical Company (NSC), BridgeWater, NJ
As Senior Analyst, ERP Architect / Developer embedded with operational areas, including the Supply and Demand teams.
Developed an RPA early-warning capability (aka eWarning) with higher-functioning custom UDF Excel/VBA functions and SQL, with ADO/ODBC connectivity to Oracle to acquire historical sales data, inventory, and factory production schedules from a data warehouse; it also performs ETL transfers of CSV files, i.e., feeds from SAP into this new system. The app detects breaks in ordering patterns. Its job was to alert the planning community and customer service (CSRs) to reach out to a customer (customers here are big Fortune companies) when an order is not received when expected. In the ABC analysis of things, these are typically the 'A' customers, i.e., the small share of customers that account for the bulk of Unilever NSC's business. Almost always at the most inopportune times, a customer or its systems might fail to place orders in a timely manner for any number of reasons. Much of the time, the Unilever NSC CSR (the customer service rep) reaches out to the customer and gets the order. Outcomes, including when a customer chooses not to place an order, are fed back to the demand and supply teams, and forecasts are revised as needed. Anecdotal stories from CSRs recount how sometimes a customer had simply forgotten to submit an order and was very happy that NSC could know this... Or the customer changed their production schedule and failed to communicate this to NSC. This was a real bonus for NSC. A customer that fails to place an order can shut down its own operations, especially one that operates with a just-in-time ordering strategy… Bad for them and bad for NSC… Re-tooling an NSC factory to make unscheduled product is expensive. Each category of product differs in composition, causing factory down-time and costly, time-consuming equipment retooling and reconfiguration to re-stage the factory and suppliers for a different category of product.
It is not easy to make changes to the production schedule without major disruption to NSC's factory output, and this underscores the importance of an accurate forecast as key to efficient use of manufacturing capacity, i.e., production planning and capacity planning. The early-warning logic makes heavy use of order interval, amplitude, rate, and frequency.
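As a rough illustration of that interval-and-frequency idea, here is a minimal Python sketch; the original was Excel/VBA over warehouse data, and the k-sigma threshold and day-number inputs here are assumptions for the example.

```python
from statistics import mean, stdev

def order_intervals(order_dates):
    """Days between consecutive orders; order_dates are sorted day numbers."""
    return [b - a for a, b in zip(order_dates, order_dates[1:])]

def overdue_alert(order_dates, today, k=2.0):
    """Raise an early warning when the gap since the last order exceeds
    the customer's typical interval by k standard deviations (assumed knob)."""
    gaps = order_intervals(order_dates)
    mu, sd = mean(gaps), stdev(gaps)
    return (today - order_dates[-1]) > mu + k * sd
```

A customer who orders roughly every 30 days would trip the alert once the silence stretches well past their historical rhythm, giving the CSR time to reach out before the schedule is disrupted.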
When I first proposed eWarning, I was asked to explain its value to senior management, demonstrate its value-added to the company, and prove that it would work. So I worked 'round the clock to build a working Excel application that became known as the "concept car", and it far exceeded everyone's expectations. As the Unilever NSC lead architect, my initial purpose for building it had nothing to do with the overall efficiencies later realized across the many business areas that benefited from it... I built it to protect the forecast in the event that the assumptions that went into producing the forecast failed to materialize. I invented eWarning to protect the forecasts, because I led the project that built the forecasting system; jokingly I'd say my reputation was at stake. That was my sense of humor, but in truth eWarning raises a heads-up, giving planners time to react and, hopefully, meaningfully re-use what would otherwise have been unused or unspent manufacturing capacity… an unforced error. The eWarning app studies every customer's ordering and shipment patterns (their typical lead time when placing orders), and in doing so it became a tactical but also strategically important operational tool and a competitive advantage for NSC… It warns planners and customer service with sufficient lead time and averts big logistics problems that had come to be accepted as the cost of doing business. This Excel workbook filled a gap that no one knew existed... solved a problem presumed not solvable. NSC IT was unable to make a business case to recreate this process on its conventional technology platforms; as I recall, IT said that reproducing what eWarning (the Excel workbook) could do would have been a years-long effort.
IT, not considered a fan of Excel and not positioned to take this task on, did something unprecedented: it embraced this Excel app, and so began a growing respect throughout IT for Excel and VBA. To predict the next order date, eWarning uses a form of rate-over-time estimation to anticipate the next order-date datapoint, with pattern-detection algorithms. It studies order interval and amplitude: when to expect orders, at what frequency, and in what amounts. We are talking about tank-truck and railcar-sized shipments of product, or barrels, or tote-bins; not getting it right is costly. But I built eWarning for selfish reasons, I would say... I was the architect of NSC's forecasting system. I had faith in that system to always produce a sensible forecast, so I built eWarning to keep an eye on NSC customer buying patterns; if their ordering patterns changed, or something as simple as a forgotten order occurred, it would be costly both for NSC and for the customer, possibly forcing a shutdown to reconfigure factories (costly, given the varying equipment needs of each production run). eWarning would in fact reliably raise the alert, giving the planner(s) sufficient time to adjust schedules with little to no disruption to factory output. I built eWarning to protect the forecasts. :) SAP to this day cannot support these capabilities. Forecasting is the Demand Planning team's primary planning tool. It drives when to make product, how much to make, and in which regional warehouse or cross-dock to most efficiently place product for its final leg of the journey to the customer. Forecasting produces a forecast, checks inventory positions and where inventory sits in proximity to the customer, and adjusts production output to be net of current inventory positions. Transportation and shipping costs are expensive, even more so now.
Not storing and warehousing product efficiently by geography means substantial added shipping costs. This was a great project. I was embedded with the Demand and Supply teams and accepted as one of them, part of their team. This meant the world to me…
I have an extensive background in forecasting, predictive analytics, marketing, metrics, inference engines, and econometric modeling as developer and programmer, all on Excel/VBA technology platforms, with RPA/ML and AI automation driving model-fitting, i.e., the running of thousands of simulations where the system, on its own, decides the best algorithm, choosing from Time-Series, Simple Regression, Multiple Regression, Bayesian Linear techniques, and Single, Double, and Triple Exponential Smoothing. By itself, the RPA drives a secondary process that determines the optimal alpha, beta, and gamma degrees of dampening, and sometimes it determines that "simple is better", that a moving average, weighted moving average, or simple average is best; this is RPA's way of punting when no algorithm, signal, or method exists within the data it is working with... Using the aforementioned algorithms, it iteratively runs these simulations for each of the thousands of sets of historical data it works with, and chooses the winning model, the one algorithm that wins the contest from the thousand-plus simulations, a thousand-plus passes at each set of data. RPA picks as best the one simulation that most closely mirrors the signature, the signal, of the most recent months of historical data, a proxy that represents the future, essentially the baseline for these simulations. Essentially, it uses a process commonly known as back-testing, looking back at multiple years of historical data (the actuals) and feeding from the oldest of those years, in this case data coming from SAP via an ETL handshake.
RPA puts this data through its paces, the simulations, using every model and smoothing-constant combination. Using these accuracy metrics, MAPE, APE, MAD, StdDev (sigma tolerance), least-squares MSE (derived from a sum of squared errors), and a method I invented, Degrees of Accuracy, it chooses the one simulation that demonstrates the smallest variance, i.e., where, comparing its signal with the signal of the most recent months of historical data (the proxy representing the future), both curves, the proxy and the winning simulation, their trend, slope, every bump, cycle, and seasonal feature, appear to move in lockstep, collinear… And voilà, this is model-fitting. This process almost never gets fooled by noisy data, the errant bump (the anomaly) that inevitably exists in historical data, i.e., the one-off events… The project was deemed necessary because SAP forecasts were not sufficiently sophisticated to achieve an acceptable degree of accuracy… Each month this process evaluates itself, A-F/A, on how it did in the previous month(s); it measures its own accuracy, produces a report for its administrator, a scorecard, and re-model-fits those data-sets where RPA thinks it can do a better forecast going forward... I have built many variations of this methodology dating to my days at U of P, spending considerable time at the University City Science Center on Market St, the West Philadelphia campus…
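The back-testing loop described above can be sketched compactly. This Python sketch covers only simple exponential smoothing scored by MAPE; the real system spanned many more algorithms and metrics, and the alpha grid and holdout length here are assumptions for the example.

```python
def ses_forecasts(history, alpha):
    """One-step-ahead forecasts from simple exponential smoothing."""
    level = history[0]
    preds = []
    for x in history[1:]:
        preds.append(level)                  # forecast for this period
        level = alpha * x + (1 - alpha) * level
    return preds                             # preds[i] forecasts history[i+1]

def mape(actuals, preds):
    """Mean absolute percentage error."""
    return sum(abs(a - p) / abs(a) for a, p in zip(actuals, preds)) / len(actuals)

def fit_alpha(history, holdout=6):
    """Back-test: score each smoothing constant on the most recent
    `holdout` periods (the proxy for the future) and keep the winner."""
    best_alpha, best_err = None, float("inf")
    for i in range(1, 100):
        alpha = i / 100
        preds = ses_forecasts(history, alpha)
        err = mape(history[1:][-holdout:], preds[-holdout:])
        if err < best_err:
            best_alpha, best_err = alpha, err
    return best_alpha, best_err
```

The full model-fitting contest repeats this inner loop for every candidate algorithm and every data set, then crowns the simulation whose signal best tracks the recent months.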
I developed several large-scale automatons... one of many projects working with RPA/ML and AI as the developer. This particular application distinguishes itself from other current-state RPA: a true automaton, a snap-on tool that attaches to existing workbooks, e.g., trading apps that feed from live data, local web-farms, Bloomberg, Reuters, perhaps numerous URLs among the many realtime internet information providers, and does so exclusively on Excel/VBA SQL technology platforms. Several thousand lines of VBA code. Designed to self-install and attach to existing Excel apps; essentially an add-in, an xlam that self-installs in a matter of seconds. It uses VBA code that writes VBA code, which it embeds via the VBE inside the receiving workbook's activate event during the robot add-in's (xlam) open event. The app self-studies live data using ANOVA and qualitative, quantitative, algorithmic, stochastic, and other well-known statistical methods. This add-in is a true robot that operates like an employee. Essentially, with instantiation, it is replicable, scalable general staffing: quants, analysts, middle-office staff, where each instance's role, as one example, is to monitor an Excel desktop, watching live ticker data, with an ability to spot what it determines is an actionable event, e.g., a trading opportunity... The advantage versus a human: this app never blinks, takes no bathroom breaks, a workaholic… It learns, studies on its own "what normal looks like". It gains perspective, a context; essentially it studies every digital signature, every change in the data, gaining an innate understanding of normal and the ability to identify an anomaly, a datapoint or series that exceeds the sigma tolerance, an outlier that deviates outside of what it knows is normal variation, ergo a potential actionable event, i.e., a "new normal" moment.
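The "normal versus new normal" check amounts to a sigma-tolerance test against recent variation. A minimal Python sketch follows; the trailing-window approach and the 3-sigma default are assumed knobs, and the original did this in VBA over live ticker data.

```python
from statistics import mean, stdev

def is_actionable(window, new_point, sigma_tolerance=3.0):
    """Flag a datapoint that deviates beyond the sigma tolerance of
    recent 'normal' variation (mean/std of a trailing window)."""
    mu, sd = mean(window), stdev(window)
    if sd == 0:
        return new_point != mu
    return abs(new_point - mu) / sd > sigma_tolerance
```

In the live setting, each tick that trips this test becomes a candidate actionable event to be escalated.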
Equipped with this ability, it instantly communicates the observed event to SMEs, stakeholders, actionaries (the decision makers), the traders, with relevant charts, essentially snapshots of cells (before/after), and artifacts that tell the story, via email, of what triggered the alert. It does this using HTML and an Outlook object embedded in its VBA via OLE. Seriously advanced VBA that makes heavy use of the Intersect() function (just one example), including reaching inside the data-series collections, properties hidden deeper inside an Excel chart or graph object. These collections are pointers that tell the chart in which cells to find the data it feeds on, and this also helps the robot know which charts are relevant to the story. Snapshots of these charts, graphs, and dashboards (the artifacts) are sent either as attachments or embedded inside these heads-up email alerts, in fractions of a second, i.e., sending the email to the actionary, a trader, a managing director, and/or it can send a pulse to a back-office mechanism to make the trade, or initiate the trade by itself; generally speaking, it alerts the actionary, in this case a back-office trading app, or sends a text message or email to the responsible persons...
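The alert-assembly step, an HTML heads-up email with a chart snapshot embedded, looks roughly like this using Python's standard library. This is a sketch only: the original used VBA with the Outlook object model over OLE, and the address, subject text, and image bytes here are invented for the example.

```python
from email.message import EmailMessage

def build_alert(to_addr, trigger, chart_png):
    """Assemble an HTML alert with an embedded chart image (sketch)."""
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["Subject"] = f"Alert: {trigger}"
    msg.set_content(f"Actionable event detected: {trigger}")  # plain-text fallback
    msg.add_alternative(
        f"<html><body><p><b>Actionable event:</b> {trigger}</p>"
        '<img src="cid:chart1"></body></html>',
        subtype="html",
    )
    # attach the chart snapshot to the HTML part, referenced by Content-ID
    msg.get_payload()[1].add_related(
        chart_png, maintype="image", subtype="png", cid="<chart1>"
    )
    return msg
```

The before/after cell snapshots and dashboard images would ride along the same way, either inline via Content-ID or as plain attachments.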
Earlier Experience
University of Penna (DRL/UCSC, Market St, Phila), Math Dept. As a student, assistant to Math head professor C. Bergey; Fortran/C/PL/VB/JAVA student advisor and tutor (mentor), onsite student support at the University City Science Center (Students' Computer Center)…